GPU vs. CPU Upgrade: Extensive Tests

leo2kp

Distinguished
This is an endless topic of conversation, with everybody you meet having their own pet opinion. What brings better results: purchasing a faster graphics card, or investing your cash in a more powerful processor? In an effort to find out, Tom's Hardware has taken a good look at the most important chips. In this article, the GeForce 6800 GT, 7950 GT, 8800 GT, 8800 GTS 512, 9600 GT 1024 and 9800 GTX are cross-tested in terms of performance and pitted against current CPUs like the E2160, E6750, Q6600 and X6800EE.

http://www.tomshardware.com/reviews/cpu-gpu-upgrade,1928.html



A very welcome comparison!! Thanks :)
 
I've seen too many times people being told that running an s939 at 2.7 and higher wasn't fast enough for their card. At 2.7, an s939 equals about a 2.1 C2D, and this just shows what you'll get. The only important thing here is that some games are extremely CPU dependent, and for those select games, the fastest CPU affordable is usually a good idea. But having someone completely update a rig that's used primarily for gaming, say an older OC'ed s939, just doesn't bring the benefits. Maybe 25% tops, whereas if they'd just buy a better card, they'd get a huge increase. Like I've always said, the best single purchase a gamer can make to better his gaming is a better gfx card.
 
Well, s939 is getting crusty, you should dump that thing if you game 😉

I meant more the people saying to get a quad core over any dual core CPU. But then I just dumped my 2800+ with a 7600 GS for a 5400 and an 8800 GTS... and I can tell you that overall it is five times faster, not counting HD access, and that 5 times was measured, not guessed at... 1960 or so to 9611 in 3DMark06 😉.
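For what it's worth, that "five times" figure does line up with the two 3DMark06 scores quoted above; a quick back-of-the-envelope check:

```python
# Sanity check on the "five times faster" claim, using the 3DMark06
# scores quoted in the post (1960 before the upgrade, 9611 after).
old_score = 1960   # 2800+ with a 7600 GS
new_score = 9611   # 5400 with an 8800 GTS

speedup = new_score / old_score
print(f"speedup: {speedup:.2f}x")  # ~4.90x, i.e. roughly 5x
```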


 
This is what I'm saying though: someone came in wanting to upgrade from a 7900 GT to an 88xx card. Someone asked what the CPU was. He was running an s939 at 2.7, and instead of saying get the 88xx, they said it'd be a waste and he'd have to upgrade his whole rig or suffer. This shows that's just not true. There's really no difference between the S939 and S940 with AMD, very little advantage, and if you take that into consideration along with this article, it shows that you won't suffer as much as gain. It may turn out a little unbalanced/underperforming, but it'd still be a huge improvement in gaming.
 

From this one review? Explain to me how a Q6600 @ 3.2GHz can't beat a Q6600 @ 2.4GHz with a 9800GTX, despite medium-low res with low FSAA and CPU scaling evident on other CPUs. On a quick glance, something doesn't add up. Likewise, a 7950GT beats a 9800GTX in Crysis? And why test low and very high in the same charts?

I'll look it over in depth later. But at a glance it sure doesn't pan out with Xbit's and other sites' findings: http://www.xbitlabs.com/articles/cpu/display/core2quad-q6600_8.html#sect0

BTW, just checked up on this. Firingsquad doesn't show any CPU scaling in COD4 even at low res:
http://www.firingsquad.com/hardware/intel_core_2_duo_e8500_wolfdale/page10.asp



 

Yep, Tomshardware's reviews and benchmarks have been getting shaky lately. :sarcastic:
 



Careful now before you go trash talking my rig..... 😉
My crusty old socket 939 4600 X2 paired with a BFG 8800GTS 512 OC scores just around 9780 in 3dmark 06.
Course the CPU and GPU are both overclocked just a little....

By the way, just why the hell are there no AMD processors in this test?
Did I miss something, or is this just an "Intel Only" club?
That would really shed some light on whether the processor is indeed instrumental with a powerful GPU.
 


@pauldh,

I don't dispute all that, but the fact is right now, for GAMING, most devs are still cutting their teeth and LEARNING dual cores, much less maxing out and optimizing for dual, and the situation is only compounded for quads. If you can get a quad with the same clock for about the same price, sure, go for it, or if you do encoding and stuff for a living, or just a lot of stuff like that, again, go for it. GAMING, however, does not really benefit just yet from quad cores for what they end up costing vs. a higher clocked dual core. When the devs really get better at multicore, prices will have dropped by then anyway, and current quads will be old tech compared to what will be out then.
 
I can't see the point in using AMD stuff - it still performs the same basic functions, doesn't it? So the relationship between the basic functions should be the same regardless of who made the hardware.
 



That is a good score! Did I mention mine was stock, so it will be blowing up AFTER yours?! 😛 lol j/k
 
^lol, yeah, I just got the 8800 about 2 months ago.
But I've been running the processor at 2.8 for about 2 years now. It will clock to ~3.0-3.1 and run stable (that's where it has to be to get that score), however it does get pretty darn warm, so I don't leave it there most of the time. I would like to do a complete system upgrade sometime this summer though; I just had other things come up that I had to spend the money on so far.
 
I have a few rigs...

AMD X2 2.6 with dual 8800 GTS (G92): 10k+ 3DMarks (had an FX-60, which is basically the same CPU with an unlocked multiplier... it scored 11k+ in 3DMark overclocked to 2.9)
2.5 Phenom w/ 3870 X2 and 3870: 13k+ 3DMarks (can't get a decent overclock past 2.6... still tweaking. Got 14k+ 3DMarks at 2.7 but it was unstable in real-world games)
Q9300 @ 3GHz w/ 3870 X2 and 3870: 16k+ 3DMarks (at stock 2.5 it scores near the same as the Phenom, just a hair ahead; it's the overclocking potential that's huge here)
QX9650 @ 4GHz, 3 8800GTXs: 21k+ 3DMarks (at stock 3GHz I get 15k 3DMarks)

Gives you an idea of what's out there.

These are all competent at 1920x1080/1920x1200 with full eye candy and playable frame rates (i.e. COD4, Unreal 3, BioShock, Assassin's Creed, Frontlines: Fuel of War, Turok... save Crysis... Crysis at medium to high is OK... @ 4GHz very high is almost playable in tri-SLI).
 

Nah, AMD just failed to provide Socket 775 compatible processors.

It would be interesting to see if the scaling is different with AMD mainboards/CPUs/GPUs. Then again, how likely is that?
 
Well, the tests showed a bit of what I was expecting.
For gaming experience you get much more bang per buck if you upgrade the GPU instead of the CPU.
The gain is much bigger if you have an average CPU and a top-end GPU than vice versa.

Nothing to see here.


"Till they take the GPU from my cold dead hands!!!"
 

I think people say this because they went from an AMD XP or a P4 and upgraded straight to an Intel quad core. Then they think the massive improvement is because of the quad, when really what makes it so fast is the new architecture; they would have seen the same benefits by merely going to a Core 2 Duo.
 

Yeah, I agree there is really no real-world gaming performance difference right now between the uber dual and quad core CPUs. They are all excellent and really pretty equal during actual gaming. Max playable settings in most games are typically GPU limited, not CPU limited, so if you have enough CPU then there would be little difference. The quads are just as good at gaming as the duals, which has been my point all along, so like you said, for the gamer it comes down to price. Down the road they could potentially be better, like we already see in some games. The only thing I dispute is people who act like the quad is worse at gaming for whatever reason they care to pull out of a hat. And I do dispute people who were pushing the E8400 at any price, like back when it was more expensive than the Q6600.

Moving on to CPU scaling tests (low res, no FSAA), the Q6600 at high clocks tends to keep up with a dual core at even higher clocks. Look at Xbit for an example in all the games they test. But let's just say, for argument's sake, that clocked the same, the quad and dual keep up with each other. What I pointed out above is, first, the 2.4GHz Q6600 edged out the 3.2GHz Q6600 (something is wrong). Second, the 2.67GHz E6750 beat the 3.2GHz Q6600 by 10 fps (something is really wrong). Xbit shows how well the Q6600 scales with higher clocks, even pulling ahead of an even higher clocked dual. I'm a little skeptical right off the bat because of the massive E6750 lead when clocked 500+ MHz less than the Q6600.
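The sanity check being applied here can be sketched in a few lines. The fps values below are made-up placeholders, not the article's numbers; they only illustrate the pattern being flagged, namely a higher clock of the same CPU scoring lower at CPU-limited settings:

```python
# Illustrative sketch of the scaling sanity check: at low res / no FSAA
# (CPU-limited), a higher clock on the same CPU should not score lower.
# The fps values here are hypothetical placeholders, not review data.
results = [
    ("Q6600 @ 2.4GHz", 2.4, 95.0),
    ("Q6600 @ 3.2GHz", 3.2, 93.0),  # higher clock, lower fps -> suspicious
]

def scaling_anomalies(results):
    """Return (slower-clock, faster-clock) name pairs where fps went DOWN."""
    return [
        (a[0], b[0])
        for a in results
        for b in results
        if b[1] > a[1] and b[2] < a[2]
    ]

print(scaling_anomalies(results))
# -> [('Q6600 @ 2.4GHz', 'Q6600 @ 3.2GHz')]
```

Any pair the check returns is the "something is wrong" case: either the benchmark was GPU limited after all, or the test systems differed in more than just CPU clock.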
 

Yep, those results simply conflict with all existing benchmarks before it. It's not possible that all existing benchmarks are wrong, therefore that one must be wrong. :sarcastic:

Tomshardware is really slipping lately.
 

Well, I'm not going to jump all over them just yet, as I really have not read the review, only very quickly glanced at the Crysis and COD4 charts. But the COD4 results I had seen look like they were done on different systems altogether, not just different CPUs.

Crysis seemed odd in that they had no performance boost in the GPU bench from 2.4GHz to 3.2GHz. At 1680x1050, all high, 2xAA/16xAF, I gained 4-5 fps in the GPU bench OC'ing my Q6600 from 2.4 to 3.0GHz. Yet at 12x10 high they gained 0.1 fps going to 3.2GHz. It's like they dropped the mem clocks or something. I will have to test 12x10 high with a single 8800GT to see if I have gains, or if it's only with SLI 8800GT. Is it possible the single 9800GTX was holding things back at 12x10 high?

How about you? In Crysis with your system, what difference do you see in the GPU bench by OC'ing your Q?
 

Well, I did notice that the frame rate drop in heavy battle sequences, when there are lots of AI blowing up lots of destructible environment, mostly went away at 3.6GHz compared to 2.4GHz. It's the difference between a constant 50fps and 45-50fps. 😛

Didn't actually bench it, too lazy. 😀