[citation][nom]belardo[/nom]Look at an overclocked fx-8150 running at 4.6~4.8Ghz, apply that performance to a 4.0Ghz FX-8350 (if it is running 4 Ghz)..[/citation]
Wrong. Allow me to explain:
Pair the best CPU available today with the worst graphics card and you have a "bottleneck" in gaming, correct? Now overclock the CPU to 1284048 GHz; the GPU will still be the bottleneck, and FPS will remain largely unchanged, correct?
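To put made-up numbers on that (a toy sketch, not a benchmark): the slower component sets the frame rate, so scaling the CPU side changes nothing while the GPU is the limit.

```python
# Hypothetical figures: a CPU that can prepare 200 frames/s paired
# with a GPU that can only render 60 frames/s.
def effective_fps(cpu_fps, gpu_fps):
    # The slower component sets the pace.
    return min(cpu_fps, gpu_fps)

print(effective_fps(200, 60))  # GPU-bound: 60 fps
print(effective_fps(400, 60))  # CPU "overclocked" 2x: still 60 fps
```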
There are many factors besides clock rate that determine a CPU's speed, and most of them can't be changed. If, for example, the cache isn't "optimized", the CPU has to wait for data from the next cache level or, god forbid, from RAM, and it wastes a ton of clock cycles until that data is available. Overclocking lets the CPU finish its own work faster, but it doesn't change the fact that it still waited a ton of time for the data to arrive - memory latency doesn't shrink just because the core clock went up.
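Here's the idea as a toy latency model (all numbers invented for illustration): core work speeds up with the clock, but the time spent waiting on memory stays fixed, so a 20% overclock buys well under 20% more performance.

```python
# Toy model: total runtime = compute time (scales with clock)
#                          + memory stall time (does not scale).
def runtime_s(core_cycles, clock_hz, mem_stall_s):
    return core_cycles / clock_hz + mem_stall_s

base = runtime_s(4e9, 4.0e9, 0.5)  # 1.0 s of compute + 0.5 s of waiting
oc   = runtime_s(4e9, 4.8e9, 0.5)  # same work at a 20% overclock

print(round(base / oc, 3))  # ~1.125x faster, well short of 1.2x
```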
An overclock can actually hint at how efficient an architecture is: watch how performance scales with clock speed. If the gain is linear or almost linear, the architecture is very efficient and can feed data to the execution units quickly. If the gains are sub-linear some or most of the time, the architecture could use some more optimization - such is the case with Bulldozer.
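You could express that as a simple ratio (illustrative numbers, not measurements from any real chip): divide the measured performance gain by the clock gain, and the closer the result is to 1.0, the less the cores are starving for data.

```python
# Gauge architectural efficiency from an overclock: compare the
# measured performance gain to the clock-speed gain.
def scaling_efficiency(clock_gain, perf_gain):
    # 1.0 = perfectly linear scaling; lower = cores stalling on data.
    return perf_gain / clock_gain

print(scaling_efficiency(1.20, 1.19))  # ~0.99: near-linear, efficient
print(scaling_efficiency(1.20, 1.10))  # ~0.92: sub-linear, stalls suspected
```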
So don't let the 10-15 percent gains fool you - if the improvements land in the right places, we may see more performance out of an overclock AND more consistent performance (except in heavy FP workloads, which is a disadvantage inherent to the design, and will probably change when the FP units are replaced by a GPU block in future generations).