How do I calculate the throughput of a uniform-delay pipeline?

I am trying to understand how the throughput of a pipeline is calculated.

My book gives:

Throughput = 1 instruction / (20 + 300) picoseconds × (1000 picoseconds / 1 nanosecond)

           = 3.12 GIPS

Now, 1 nanosecond = 1000 picoseconds.

As per my understanding, substituting 1 nanosecond = 1000 picoseconds makes the conversion factor cancel out:

Throughput = 1 instruction / 320 picoseconds × (1000 picoseconds / 1000 picoseconds)

Throughput = 1 instruction / 320 picoseconds

which is not obviously equal to 3.12 GIPS.

How do I calculate it?
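For reference, the conversion can be checked numerically. This is a minimal sketch in Python; the 320 ps period comes from the 20 + 300 ps figures in the question, and the key point is that 1 instruction per nanosecond equals 10^9 instructions per second, i.e. 1 GIPS.

```python
# Stage delay from the question: 300 ps of logic + 20 ps of register delay.
clock_period_ps = 300 + 20  # one instruction completes every 320 ps

# (1 instruction / 320 ps) * (1000 ps / 1 ns) = 3.125 instructions per ns.
throughput_per_ns = (1 / clock_period_ps) * 1000

# 1 instruction/ns = 1e9 instructions/s = 1 GIPS, so the numbers coincide.
throughput_gips = throughput_per_ns

print(f"{throughput_gips:.3f} GIPS")  # 3.125 GIPS, which the book rounds to 3.12
```

So "1 instruction / 320 picoseconds" is not a different answer: converting it to instructions per nanosecond gives 3.125, and instructions per nanosecond and GIPS are numerically the same unit.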



Source: Stack Overflow, licensed under CC BY-SA 3.0.