What is a Giga-Flop?

Stennersaurus

Nov 4, 2012
So, I just overclocked my CPU, an i7-3770K, from 3.5 GHz to 4.2 GHz. I ran a stress test before and after overclocking. Before the overclock, each test took about 30 seconds (I ran 10 tests), and each result was about the same: 36 GFLOPS.

After I overclocked it, I ran exactly the same test, but it took half the time and returned twice as many GFLOPS (about 70) for each run. So it finished the test twice as quickly, and the GFLOPS count is now twice as high.

My question is, what does that mean? Obviously my PC isn't now twice as fast, since I only increased the clock speed by about 20%. Can someone explain what this means, and also the real-world implications of having a higher GFLOPS count?
 
Solution
A "FLOP" is a floating point operation. A Giga-FLOP is 1 billion floating point operations. Depending on the benchmark, the code that is running might fit into the on-chip cache. This is probably why the overclock had such a great impact on the number.
A "FLOP" is a floating point operation. A Giga-FLOP is 1 billion floating point operations. Depending on the benchmark, the code that is running might fit into the on-chip cache. This is probably why the overclock had such a great impact on the number.
 