"You can determine CPU bottlenecks by either playing on very low resolutions or high resolutions. By maxing out the GPU capabilities then adjusting or upgrading the CPU, you are able determine if it's a cpu bottleneck."
"Adjusting" meaning overclocking, then, which according to you would be near pointless for gaming. You're saying that if matey wants a GTX 260, he won't get any better an experience with a Q9400 than with a 2 GHz Core 2. Which is most definitely FUD.
There are any number of games out there that are as heavy on the subsystem as on the GPU, the CPU being part of that system. What I've said comes from MY experience with a fast card and a slow CPU, and from what happens when I overclock said CPU. I've tried gaming at the stock 1.86 GHz instead of at the maximum overclock of 2.87 GHz, and let me tell you: IT WAS NOT PRETTY. The rest of my system not good enough, perhaps? Hardly: 4 GB DDR2-800 @ 820 MHz, a Samsung 500 GB hard drive, Win XP. Defragged, clean, msconfig used to turn off all the crap.
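That clock-scaling test can be put as a back-of-the-envelope calculation: at fixed graphics settings, if fps moves roughly in step with CPU clock, you're CPU-bound; if it barely moves, you're not. A quick sketch — the fps figures below are made up for illustration, not numbers from my runs:

```python
def cpu_bound_score(clock_a, fps_a, clock_b, fps_b):
    """Ratio of fps scaling to CPU clock scaling at fixed settings.

    ~1.0 -> fps tracked the clock change (CPU-bound),
    ~0.0 -> fps barely moved (GPU-bound or limited elsewhere).
    """
    clock_gain = clock_b / clock_a - 1.0  # e.g. 1.86 -> 2.87 GHz is ~+54%
    fps_gain = fps_b / fps_a - 1.0
    return fps_gain / clock_gain

# Stock 1.86 GHz vs 2.87 GHz overclock, with hypothetical fps numbers:
print(round(cpu_bound_score(1.86, 25.0, 2.87, 38.0), 2))  # ~0.96 -> CPU-bound
print(round(cpu_bound_score(1.86, 60.0, 2.87, 62.0), 2))  # ~0.06 -> not CPU-bound
```

It's crude (scaling is rarely perfectly linear, and minimum fps matters more than the average), but it makes the point: a big overclock that moves fps a lot is a CPU bottleneck by definition.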
Games list:
Crysis,
Crysis: Warhead,
Supreme Commander,
S.T.A.L.K.E.R.: Shadow of Chernobyl,
S.T.A.L.K.E.R.: Clear Sky,
GRID,
Far Cry 2,
Fallout 3,
Battlefield 2,
Left 4 Dead.
Apart from the last two, all of them took an absolute hammering from the loss of 1,000+ MHz. L4D might have too, but fps is always going to be sky-high on such an old engine with a 4870 at 1400x900, my max res. As for BF2, I've been on 90+ fps since I had a 7900 GT.
The new Total War demo didn't appreciate sh@tty stock clocks either.
I'll get a Photobucket account and post proof to dispel your FUD once and for all.
Back in the G80 days you would have had a point.