GPU vs. CPU Upgrade: Extensive Tests

Status
Not open for further replies.

randomizer

Champion
Moderator
That would simply consume more time without really proving much. I think sticking with a single manufacturer is fine, because you can see the generational differences between cards and the performance gains compared to getting a new processor. You would see the same thing with ATI cards: pop in an X800 and watch it crumble in the wake of an HD3870. There is no need to include ATI cards for the sake of this article.
 

yadge

Distinguished
Mar 26, 2007
443
0
18,790
I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance of my X1950 Pro. But here, the 9600GT was getting three times the frames of the 7950GT (which is better than mine) in Call of Duty 4.

Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.
 

Guest

Guest
This article is biased from the beginning by pairing a reference graphics card from 2004 (6800GT) with a reference CPU from 2007 (E2140).

Go back and use a Pentium 4 Prescott (2004), and then the percentage values on page 3 will actually mean something.
 

randomizer

Champion
Moderator
[citation][nom]yadge[/nom]I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance of my X1950 Pro. But here, the 9600GT was getting three times the frames of the 7950GT (which is better than mine) in Call of Duty 4. Maybe there's something wrong with the charts. I don't know. But this makes me even more excited for when I upgrade in the near future.[/citation]
I upgraded my X1950 Pro to a 9600GT. It was a fantastic upgrade.
 

wh3resmycar

Distinguished
[citation][nom]scy[/nom]This article is biased from the beginning by pairing a reference graphics card from 2004 (6800GT) with a reference CPU from 2007 (E2140).[/citation]

Maybe it is. But it's relevant, especially for those people who are stuck with those Prescotts/6800GTs. This article reveals an upgrade path nonetheless.
 
Great article!!! It clears up many things. It finally shows proof that the best upgrade a gamer can make is a newer card. About the P4s: just take the clock rate and cut it in half, then compare (OK, add 10%), heheh.
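
To put that rule of thumb in concrete terms, here's a quick back-of-the-envelope sketch; the clock speeds are illustrative examples, not benchmark data:

[code]
# Rule of thumb from the post above: a Pentium 4 performs roughly like
# a Core 2 at half its clock rate, plus about 10%.
def p4_to_core2_equivalent_ghz(p4_clock_ghz: float) -> float:
    """Estimate the Core 2 clock a Pentium 4 roughly compares to."""
    return p4_clock_ghz / 2 * 1.10

# Example: a 3.4 GHz Prescott compares to roughly a 1.87 GHz Core 2,
# i.e. somewhere around a stock E2160.
print(f"{p4_to_core2_equivalent_ghz(3.4):.2f} GHz")
[/code]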
 

justjc

Distinguished
Jun 5, 2006
235
0
18,680
I know randomizer thinks we would get the same results, but would it be possible to see just a small article showing whether the same result holds true for AMD processors and ATi graphics?
Firstly, we know that ATi and nVidia cards don't calculate graphics in the same way; who knows, perhaps an ATi card requires more or less processor power to work at full load. And if you look at Can You Run It? for Crysis (the only one I recall using), you will see that the minimum required AMD processor is slower than the minimum required Core 2, even in raw clock speed.
So, any chance of a small, or full-scale, article throwing some ATi and AMD power into the mix?
 

randomizer

Champion
Moderator
In the case of processors, throwing in some AMD chips would be a good idea, as they are often a fair bit slower than a similarly priced C2D. However, I don't think having ATI cards in the mix would show anything really different from what we have now. The higher-end cards will be bottlenecked on a slow processor while the slower cards won't be bottlenecked as badly. A 3870X2 will need a more powerful CPU to reach its maximum potential, just like a 9800GX2. Of course, the amount of CPU power needed will almost certainly be different, but the overall conclusion is that buying a next-gen (or rather current-gen) card is going to benefit you more than a new CPU unless yours is really old. That is all the article is trying to prove, not which CPU/video card is best.
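
That bottleneck argument can be sketched as a toy model; the numbers below are entirely hypothetical and only illustrate the reasoning:

[code]
# Toy model of the bottleneck argument: the frame rate is capped by
# whichever component (CPU or GPU) can deliver fewer frames per second.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Frame rate is limited by the slower of the two caps."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical caps for a slow vs. fast CPU and an old vs. new card.
slow_cpu, fast_cpu = 40, 120   # frames/s the CPU can prepare
old_gpu, new_gpu = 30, 150     # frames/s the GPU can render

print(effective_fps(slow_cpu, old_gpu))   # 30: the old card limits you
print(effective_fps(slow_cpu, new_gpu))   # 40: now the CPU is the wall
print(effective_fps(fast_cpu, new_gpu))   # 120: the new card pays off
[/code]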
 

LuxZg

Distinguished
Dec 29, 2007
225
42
18,710
I've got to say I agree there's no need to add ATI cards, as they are easily comparable to the selection of nVidia cards shown in the article.

But it lacks a comparison with single-core CPUs. In the days of the 6800GT's popularity we had single-core Athlons like the Barton, and perhaps the early socket 754 Athlons/Semprons. As adding an AGP system would complicate things way too much, I think at least a socket 939 Athlon 64 could be used, as those were very popular during the 6800GT and 7xxx series era.

At the very least we should have one set of benchmarks on a low-end/old CPU like that, so we can see if buying a faster card is of any use at all, or whether we hit a performance floor the same way we see the 6800GT unable to use the additional power of new CPUs.

Other than this, it's a great article!

PLEASE - make one more roundup like this once the GT200/4800 cards are out!

P.S. And nice to see that a 9800GTX and an overclocked Q6600 still come in right at 300 W consumption, meaning any quality 400/450 W supply is more than enough for them!
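
As a quick sanity check on that headroom claim (the 300 W draw is the article's figure; the 450 W rating is just the poster's example):

[code]
# Back-of-the-envelope PSU headroom check for the 300 W system above.
system_draw_w = 300                 # measured full-load draw
psu_rating_w = 450                  # a quality 450 W supply

headroom_w = psu_rating_w - system_draw_w
load_fraction = system_draw_w / psu_rating_w

print(f"Headroom: {headroom_w} W, load: {load_fraction:.0%}")
# -> Headroom: 150 W, load: 67% -- a comfortable operating point for
#    a quality unit rated to deliver its full label wattage.
[/code]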
 
Onya ... thanks for pinching the idea from our thread on the subject, buddy!!! You guys must troll our topics looking for good ideas, eh??

Seriously though ... cheers and thanks !!

Hey ... update the article with a couple of AMD CPUs thrown into the mix ... perhaps a 3800+, a 5000+, and a 6400+, and a couple of the 50-series Phenoms.

Thanks again...
 

cah027

Distinguished
Oct 8, 2007
456
0
18,780
This is a really cool article. I think it should be updated with every new generation of parts (e.g. new ATi and Nvidia cards, and Nehalem and Fusion).

I wonder whether Nehalem will be such a big boost that it shows gains across the whole range of GPUs?
 

danatmason

Distinguished
Apr 8, 2008
11
0
18,510
I've been waiting for an article like this! It's great to have figures to back up the idea of a CPU bottleneck. I tried to pair an 8800GT with an Athlon X2 4000+ at stock speed and I was HUGELY disappointed. But with a quick OC of the processor, I'm sitting happy! So I imagine those results scale similarly with AMD processors, and the same idea - that clock speed matters more for games than core count - will still hold true.
 

royalcrown

Distinguished
There's no need to throw a bunch of AMD processors in here, for one reason, well, two...

1. You can see on the CPU charts how the AMD you have compares with an Intel... so it would perform about the same with the same card.

2. CPUs in the same FAMILY on the same architecture will scale between themselves the same way the Core 2 Duos scale relative to each other (a rough sketch of this below).
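
A minimal sketch of that scaling rule, assuming performance in a CPU-bound test grows roughly linearly with clock inside one family (the numbers are made up for illustration):

[code]
# Naive within-family scaling estimate: same architecture, so assume
# CPU-bound performance scales roughly linearly with clock speed.
def scaled_fps(known_fps: float, known_ghz: float, target_ghz: float) -> float:
    """Estimate fps at target_ghz from a measurement at known_ghz."""
    return known_fps * (target_ghz / known_ghz)

# Hypothetical: if a 1.8 GHz chip scores 50 fps in a CPU-bound game,
# a 2.7 GHz sibling from the same family should land near 75 fps.
print(round(scaled_fps(50, 1.8, 2.7)))  # -> 75
[/code]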
 

ubbadubba

Distinguished
May 15, 2008
3
0
18,510
The E8xxx and E7xxx are not mentioned. Please either include them in the next edition (along with a few AMD CPUs) or comment on which parts they would mimic. For example, does the E8400 @ 3 GHz behave like an E2160 @ 3 GHz, or is the E8400 still a little bit better by X%? The E7200 and E8200 are fairly cheap for new CPU models, so those would be nice for the next go-round.

A game that's missing is UT3 -- it benefits noticeably from AMD's X3 CPUs as a cost-effective solution, whereas the X3 may not be as good in other games. But maybe that would throw off the overall results if one game did really well and did not follow the same trends as the other games.

And yes, we need AMD CPUs to see the scaling effects across the different GPUs. The interaction of weak CPUs with strong GPUs, or vice versa, is not represented in the CPU charts.

Speaking of which: when are the GPU charts going to be updated with modern GPUs and games?
 

Guest

Guest
What an incredible article! My jaw dropped to the floor at the cost per frame chart. Really nice work! Now, the holy grail is cross-linking article data like this to the CPU and Vid charts. I am very impressed.
 

a 6pack in

Distinguished
Nov 12, 2007
157
0
18,680
[citation][nom]DjEaZy[/nom]will there be a AMD/ATI roundup???[/citation]
That is something I am really curious about. I've been bummed that I killed my quad and am back to a dual. I guess it's not really noticeable...

 

hcforde

Distinguished
Feb 9, 2006
313
0
18,790
SCY (15/05/2008 @ 10:10): most people may have a greater tendency to upgrade their GPU rather than their CPU. Maybe that is why it was done this way. We techies may not see it that way, but the general market may.
 

royalcrown

Distinguished
[citation][nom]ubbadubba[/nom]And yes, we need AMD CPUs to see the scaling effects across the different GPUs. The interaction of weak CPUs with strong GPUs, or vice versa, is not represented in the CPU charts.[/citation]

They could just as easily show that with bottom-rung Core 2-based Pentiums.
 

game_over_player1

Distinguished
May 15, 2008
1
0
18,510
Correction:

Nice article, but you have to wonder whether everything else was set up apples-to-apples, as the water settings are fully adjustable, with many levels from minimum to maximum, each with a different impact on FPS.

"The Geforce 6800 GT and 7950 GT only run with DirectX 9 effects. In this mode, the environment is not reflected in the water, but the waves are simulated cleanly by the pixel shader"

This is not correct. The DX9 version of FSX displays landscape, cloud, and aircraft reflections in the water on any old DX9 card, including GeForce 5 cards. There are subtle enhancements to the DX10 reflections that most consider unrealistic, but both DX9 and DX10 offer complete reflections in the water.
 