What is the "true" measure of cpu performance?

That's about right, but here's another way to think of it:

You have a bathtub full of water, and the water must be moved from the bathtub and poured down the sink. There are two buckets available: a green one that is large but heavy, so each trip is slower; or a blue one that's small and carries less per trip, but lets you make many more trips in the same amount of time.
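The bucket analogy boils down to throughput being (amount per trip) x (trips per second), just like CPU performance being roughly IPC x clock speed. Here's a quick sketch of that idea; the numbers are made up purely for illustration, not real CPU figures:

```python
# Throughput is work-per-trip times trips-per-second, regardless of
# which bucket (or CPU design) you pick. Illustrative numbers only.

def throughput(work_per_trip, trips_per_second):
    """Total work moved per second."""
    return work_per_trip * trips_per_second

# Green bucket: big but slow trips (think high IPC, low clock).
green = throughput(work_per_trip=6.0, trips_per_second=1.0)

# Blue bucket: small but many trips (think low IPC, high clock).
blue = throughput(work_per_trip=2.0, trips_per_second=3.0)

print(green, blue)  # both empty the tub at the same rate: 6.0 each
```

Either bucket can win; it just depends on how the two factors multiply out.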

<font color=blue>If you don't buy Windows, then the terrorists have already won!</font> - Microsoft
 
You're right, I was ranting against the whole benchmarking, IPC vs. MHz, DDR vs. RDRAM BS that has been going on at this board lately. Suffice it to say, neither CPU has any real advantage over the other (unless you <i>need</i> Q3, at lowest settings, to run at 250FPS).

Actually, the whole point is that there is no "true" way. It's absurd to try. No one out there can notice anything less than a 7-10% increase in performance, in which case all you're proving is that, in that app, CPU A has an edge over CPU B. So, really, only benches that show that kind of difference should be published, so that you can see that CPU A has an advantage in encoding and some types of gaming, while CPU B has an advantage in content creation and multi-tasking. That's all I care about, and you can also throw in, for the overclockers out there, how much of an overclock can be obtained.
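That 7-10% rule of thumb is easy to turn into a filter: keep only the benchmarks where the gap actually clears the threshold. The benchmark names and scores below are invented for illustration (higher = better):

```python
# Keep only the benchmarks where one CPU beats the other by more than
# a noticeable threshold (7% here, per the rule of thumb above).
# All scores are made-up illustrative numbers.

def noticeable_wins(scores_a, scores_b, threshold=0.07):
    """Return {benchmark: winner} for gaps larger than threshold."""
    wins = {}
    for bench in scores_a:
        a, b = scores_a[bench], scores_b[bench]
        gap = (a - b) / min(a, b)  # relative difference
        if abs(gap) > threshold:
            wins[bench] = "A" if gap > 0 else "B"
    return wins

cpu_a = {"encoding": 120, "gaming": 110, "content creation": 90}
cpu_b = {"encoding": 100, "gaming": 104, "content creation": 100}

print(noticeable_wins(cpu_a, cpu_b))
# -> {'encoding': 'A', 'content creation': 'B'}
```

The gaming result (about a 6% gap) gets dropped, which is exactly the point: nobody would feel that difference anyway.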

-SammyBoy