GPU vs. CPU Upgrade: Extensive Tests

Status
Not open for further replies.

nowwhatnapster

Distinguished
May 13, 2008
I agree with you korsen, there are definitely games out there that will demand more from the CPU than the GPU, and vice versa. Another factor to consider before buying.
 

Divath

Distinguished
May 18, 2008
[citation]Now let's see Tom produce a chart with minimum CPU requirements for every graphics card out there, via overclock or stock! That would truly save people from creating bottlenecked systems and overpaying for one part while underpaying for another.[/citation]

Nice idea, nowwhatnapster! Would love to see that on here as well.
 
Guest
Biased review. Using GPUs from 2004 and CPUs from 2007-2008 will of course produce much better results for the GPU upgrade. Recent CPUs will hardly bottleneck at all, so the huge percentage gains come purely from the graphics upgrades. Try using some CPUs from 2004, like the Pentium 4 Prescott, where there WILL be a bottleneck and GPU upgrades will be worthless because the CPU can't keep up.

Can't believe Tom's wouldn't catch such a glaring mistake.
 

marraco

Distinguished
Jan 6, 2007
I have made an X-Y chart with this info, plotting price vs. performance. I uploaded it here:

http://bp2.blogger.com/_AIJ1WOm9bKQ/SC9p89QuLqI/AAAAAAAAABA/nmbUpekA9nE/s1600-h/Price+Performance.PNG

The red spots are the hardware that is the cheapest for a given performance and, simultaneously, the fastest for a given price.

It shows only four combinations that are worth buying:
Geforce 8800 GT OC (512 MB) E2160@2.41
Geforce 8800 GT OC (512 MB) E6750@2.67
Geforce 8800 GTS OC (512 MB) E6750@2.67
Geforce 8800 GTS OC (512 MB) Q6600@3.2

The FPS are plotted with the Geforce 8800 GT OC (512 MB) E2160@2.41 as 100%.

It shows that spending twice its price buys you less than 30% more performance.

It is probably even harder to justify spending more than the price of the Geforce 8800 GT OC (512 MB) E6750@2.67: you can spend 100€ more for only 8% more speed.

The green points are roughly in the same place; they are the Geforce 8800 GTS OC (512 MB) E6750@2.67 and the Geforce 8800 GTS OC (512 MB) Q6600@3.2.

I would buy the 8800 GTS 512 Q6600. Why? It is almost the same as the slightly cheaper and faster 8800 GT 512 E6750, but it contains four cores, which future games can probably take advantage of.

That chart shows much more readable information than the tables in this article. Any article of this kind should include this kind of chart.
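marraco's "red spots" are what is usually called a Pareto frontier: the combos no rival beats on both price and performance at once. A minimal sketch of that selection rule (the combo names echo the article's pairings, but the prices and FPS percentages here are made-up placeholders, not the chart's real data):

```python
# Find the Pareto-optimal (price, performance) combos: those not beaten
# by any alternative that is both cheaper-or-equal and faster-or-equal.

def pareto_front(combos):
    """Return combos not dominated by a cheaper-and-faster alternative."""
    front = []
    for name, price, fps in combos:
        dominated = any(p <= price and f >= fps and (p, f) != (price, fps)
                        for _, p, f in combos)
        if not dominated:
            front.append((name, price, fps))
    return sorted(front, key=lambda c: c[1])  # cheapest first

# Hypothetical prices (EUR) and performance (% of the baseline combo).
combos = [
    ("8800 GT + E2160",   450, 100),
    ("8800 GT + E6750",   550, 120),
    ("8800 GTS + E6750",  650, 124),
    ("8800 GTS + Q6600",  750, 127),
    ("8800 GTX + E2160",  800, 118),  # dominated: pricier AND slower than the Q6600 combo
]

for name, price, fps in pareto_front(combos):
    print(f"{name}: {price} EUR, {fps}% of baseline")
```

With these invented numbers, the dominated combo drops out and the four "red spot" pairings remain, sorted by price.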
 

marraco

Distinguished
Jan 6, 2007
Correction:

The green points are the Geforce 9800 GTX (512 MB) E6750@2.67 and the Geforce 8800 GT OC (512 MB) Q6600@3.2, which, in the near future, may be a better choice than the Geforce 8800 GTS OC (512 MB) E6750@2.67 because the Q6600 has two extra cores.

 

Th-z

Distinguished
May 13, 2008
Looking at the benchmark results for Blacksite and CoD4, I am really surprised by the numbers for the E2160 OC'ed @2.4: it even beats the Q6600 and the two 6-series chips at the same or higher clocks with the same or a faster GPU! That is just strange. Even more so for Blacksite, which as mentioned is based on the UT3 engine; it makes you wonder what is going on with this modern engine that powers a lot of games right now. Doesn't it at least make use of more than two cores? The other numbers are more in line with expectations.

Although the general rule is to upgrade the GPU for higher performance per dollar, one still has to consider the current setup before making that suggestion to others. Some people may have a better CPU than GPU to begin with. The hard part is knowing at what point your CPU is bottlenecking your GPU, or vice versa, without the experimental numbers.

This excellent article could also add more depth by identifying the diminishing-returns point for a CPU when paired with a GPU, plotted using line charts: for example, for a given CPU, which GPU gives the most increase before the gains subside.
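The diminishing-returns point Th-z asks for can be read straight off such a curve: step through the GPU tiers for a fixed CPU and flag the first upgrade whose relative gain drops below some threshold. A quick sketch with invented FPS numbers (the GPU names match the article's lineup, but the values are not its measurements):

```python
# For a fixed CPU, find where the next GPU tier stops paying off.
# FPS values below are made-up illustration data.

fps_by_gpu = [("7950 GT", 38), ("9600 GT", 61), ("8800 GT", 74),
              ("8800 GTS", 79), ("8800 GTX", 81)]

def diminishing_point(curve, threshold=0.10):
    """First GPU whose relative gain over the previous tier is below threshold."""
    for (_, prev_fps), (name, fps) in zip(curve, curve[1:]):
        gain = (fps - prev_fps) / prev_fps
        if gain < threshold:
            return name, gain
    return None

name, gain = diminishing_point(fps_by_gpu)
print(f"Gains subside at the {name} (+{gain:.0%} over the previous tier)")
```

With a 10% threshold and these numbers, the knee lands at the 8800 GTS, i.e. everything past the 8800 GT buys only single-digit gains for this hypothetical CPU.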


 

Rabbitrunner51

Distinguished
Nov 14, 2003
I understand the bias and reasoning in showing only Intel Core 2 Duo CPUs, but that is a bit biased in nature. Showing the very best benchmarks would be the obvious reasoning, since Core 2 Duos outperform their counterparts, but in truth it depends on processor speed versus what you're comparing.
I have an X2 5600+ CPU. It is obviously one of the better high-end AM2 chips, and it runs at 2.8 GHz, which makes up the performance gap against comparable 2.4 GHz Core 2 Duo chips. Other benchmarks comparing these two, for instance, show both very close, with the Core 2 Duo only slightly ahead. For this reason alone I think those of us with AMD dual-core chips, at least on the higher end, should be included. I would just be curious to know what the comparisons are exactly.
I also just upgraded to a 9600 GT OC edition GPU, so I figured a 6600 and an OC'ed 9600 GT were close enough to let me know what performance I can expect.
 

Siiru

Distinguished
May 19, 2008
I have a question regarding the required processor speed of between 2.4 and 3 GHz. Is it for dual-core processors only, or for single-core ones too?

My current processor is an Athlon 64 3000+ (overclocked to 2.35 GHz). I was wondering, would it be OK for me to invest in an 8800 GT card and upgrade my processor later? And about bottlenecking: would there be a huge performance drop? What percentage?

Please reply. Thanks!
 

campdude

Distinguished
Apr 20, 2008
Quote:
"I didn't realize the new GPUs were actually that powerful. According to Tom's charts, there is no GPU that can give me double the performance over my X1950 Pro. But here, the 9600 GT was getting 3 times the frames of the 7950 GT (which is better than mine) on Call of Duty 4."

I don't believe a 9600 GT will triple the X1950 Pro either.
Maybe it would double it, but not triple it... in my opinion it's still not worth upgrading from an X1950 to a 9600.
(However, they did use an OC, 1 GB version of the 9600 GT.)
 

emergancy exit

Distinguished
May 5, 2008
OK, this article basically shows that there is almost no need to buy a quad-core, given the price vs. performance increase. Why spend an extra $100 for a 20% performance increase when you can spend it on a better video card for a 70% performance increase?

Clock speed is king! Sometimes I wonder: instead of a 2.5 GHz quad-core, if they made a 10 GHz single-core, how much would it dominate? It probably wouldn't work out very well, but I want to see what happens.
 

emergancy exit

Distinguished
May 5, 2008
After thinking about it, I want to see a chart like that with SLI, to see how much upgrading a CPU for SLI helps, because upgrading to SLI taxes the CPU more.

Basically, what I want to see is the performance difference of using two 9600 GTs in SLI on the 1.8 GHz dual-core processor, then upgrading to an E8400 dual-core (3.0 GHz) and observing the results (also overclocking it to 3.6 GHz to observe).

THEN go back to the 1.8 GHz dual-core setup, add a third 9600 GT, and observe the results.

 

coldmast

Distinguished
May 8, 2007
I want to see new processors:
E8400, E8500,
Q9300, Q9450, Q9650

and SLI on graphics cards,

quad-core vs. dual-core at the same clock speed,

and theoretical maximum power consumption (we need to know how powerful a PSU to get).
 

philby_37

Distinguished
May 24, 2008
Interesting article and charts.
I don't see the Nvidia 8800 GTX 768 MB card being compared, or the Core 2 Extreme QX6700 CPU.
What settings would you suggest work best with this CPU and video card for FSX with SP2? Thanks, Phil
 

tfm

Distinguished
Apr 2, 2004
Sorry, but the figures for Flight Simulator X are complete rubbish. Using "Ultra" high settings, there is simply no way of getting more than about 25-30 fps in the Sitka approach with a 3.2 GHz processor, no matter what graphics card it's plugged into. Sure, with a fast enough CPU/GPU combination you can get 70-80 fps if you want them, but you'll have to turn your settings way down low to do it.

Makes you wonder what purpose this article is supposed to serve.

Tim
 

Viper5030

Distinguished
Jun 5, 2008
Why did they decide to run the Crysis benchmarks at different settings? My 7950 GT runs Crysis on High settings at an average of about 25 fps on a single core CPU (Athlon XP 3800+). They could have easily incorporated that into the charts instead of changing the settings per card. Now it's pointless to have a chart of all the benchmarks together.
 
Guest
I have an Athlon 64 San Diego 3700+ (2.2 GHz @ 2.8 GHz) and an Nvidia 7800 GT with 2 GB of DDR550. Upgrading to a better processor would require me to buy a new mobo and memory, and on top of that, the new GPU.

Since I'm running at such high clocks, I've come to think that it would be enough for me to just buy the new GPU. I've been thinking of a 9600 GT or an 8800 GTS. That would get me at least one more year with decent graphics, and I'd do a full upgrade once Nehalems hit the streets and their prices fall.

I could run some benchmarks before I upgrade, so I'll report the differences. I'm pretty happy with my last full upgrade in 2005, and it seems it's doing well in its fourth year with just a GPU upgrade.
 
Guest
OK, so I bought the Asus EN6800GT 512 MB Top model for my computer. I cranked the Age of Conan settings up to high, set antialiasing to 16xQ, and upped some of the view distances.

Looking at a long-distance view with mobs and water, I'm getting 56 fps. When I fought, it went down to 30. So I set 8x antialiasing, and now it's around 36 while fighting and 70 fps most of the time.

It's pretty smooth. I used to get 40-50 fps with low settings on the 7800 GT. So my conclusion is that upgrading only the GPU generation is good. I think with these settings I can go a long while before doing the full upgrade.
 
Guest
I have an Athlon 64 3000+ and a 6600, both overclocked. I'm planning to get a new GPU; I can't afford to upgrade the CPU, motherboard, and RAM at the same time. I'm considering three cards: the 3650, 3850, or 4850. Since I have an old CPU, I wouldn't be able to get the most out of the GPU, right? Would I be fine until I can upgrade the whole PC?
 

pavsko

Distinguished
Jul 19, 2008
Thanks a lot for the great testing.

But one thing is strange to me: regarding the Flight Simulator X tests, why do they not reflect the benefits of multi-core CPUs?
Did you turn on multi-core usage in FSX.cfg?

By default, multi-CPU usage is not turned on in Flight Simulator X.
It is necessary to make a change in FSX.cfg (found in your Documents, Application Data, FSX folder)
and add this section:

[JOBSCHEDULER]
AffinityMask=3

where:
AffinityMask=n
n = number of cores scheduled
1 = 1 core
3 = 2 cores
7 = 3 cores
15 = 4 cores
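pavsko's table follows the usual processor-affinity bitmask convention: one bit per core, so scheduling the first n cores gives a mask of 2^n - 1 (1, 3, 7, 15, ...). A one-line sketch of that rule:

```python
# AffinityMask is a bitmask: bit k enables core k, so enabling the
# first `cores` cores sets the low `cores` bits.

def affinity_mask(cores):
    """Bitmask value enabling the first `cores` CPU cores."""
    return (1 << cores) - 1

for n in range(1, 5):
    print(f"{n} core(s) -> AffinityMask={affinity_mask(n)}")
```

This reproduces the table above: 1 core gives 1, 2 cores give 3, 3 cores give 7, 4 cores give 15.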
 

fenerli

Distinguished
Jul 24, 2008
Excellent tests and results. I would have liked a CPU lower than the E2140 included, to see how low a CPU can go and still achieve a significant overall system performance gain from just a GPU upgrade.

An example in my case: an Athlon 64 X2 3800+ (would the conclusions from the test CPUs apply to this lower-end CPU?).

If so, that would be awesome: replace the 6600 GT graphics card with an 8800 GT or 9600 GT for ~US$120 and have a practically new machine, at least from a gaming standpoint. Would I be correct in assuming a 2.5x-5x increase?

Thanks again, and you now have another subscriber :p
 
It is very interesting, and does show how some platform homogeneity can yield best results. However...
I own an Athlon X2 3800+ (a socket 939 one, on an Nvidia 6150 first version, with 2 GB of DDR400 in dual channel, at its native 2.0 GHz). While I don't think this setup can be called powerful anymore, it isn't outdated either (it cost me a pretty penny in 2005), and as proof, equivalent systems are still present in Tom's CPU charts.

But these same charts don't include the E2160; the only similarly clocked Intel processor included is the 4300, which has an extra 2 MB of L2 cache.

Now, Intel Core 2 processors are 10 to 25% more powerful at similar clock speeds than K8; however, cache size is also very important for the efficiency of Core 2. It also happens that K8 is no slouch either, and in some cases can reach Core 2 performance per clock. Add to that the fact that fast DDR can make a K8 on s939 slightly faster than an identical chip on AM2.

I end up with a 15% error margin without taking clock speed into account: is my X2 3800+ as powerful as a E2160, slightly slower, or a whopping 25% faster (cumulated error skew + extra clock speed)?

It is also true that Geforces and Radeons don't react the same to CPU power increase (AMD chips aren't hampered as much by low power CPUs, if I'm not mistaken).

I'm not too anxious about the answer, since I replaced the original Geforce 6600 (not GT) with a Radeon HD 4850 last month, with impressive results (the GF was never a fast chip, but it was passively cooled). I'm just wondering how much benefit overclocking that processor would actually give (it can reach 2.4 GHz on air with just a thought), but overly smart power management in Linux would require either a power-manager rewrite (not my idea of fun) or disabling CnQ, and I like quiet.
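The poster's uncertainty can be bracketed with simple arithmetic: compare the X2 3800+ at its 2.0 GHz against the E2160 at 1.8 GHz, letting Core 2's per-clock advantage over K8 range from 0% (K8 matches it, as the post allows) to the quoted 25%. A sketch, purely illustrative, not measured data:

```python
# Bracket the X2 3800+ (K8, 2.0 GHz) vs. E2160 (Core 2, 1.8 GHz)
# comparison across an assumed 0-25% Core 2 per-clock advantage.

k8_clock, c2_clock = 2.0, 1.8  # GHz

ratios = {adv: (c2_clock * (1 + adv)) / k8_clock
          for adv in (0.00, 0.10, 0.25)}

for adv, r in ratios.items():
    print(f"Core 2 IPC +{adv:.0%}: E2160 performs at {r:.0%} of the X2 3800+")
```

At 0% IPC advantage the E2160 lands at 90% of the X2 3800+ (the K8 wins on clock); at the full 25% it reaches 112.5%, which is roughly the error band the post describes.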
 

nb2000nb

Distinguished
Sep 7, 2008
[citation][nom]DjEaZy[/nom]will there be a AMD/ATI roundup???[/citation]
Why waste the time? They looked at quality products only.
 

salem80

Distinguished
Jul 31, 2008
I think my E2180 @3 GHz will be great in games
with a GF8600GT (I'm not playing over 1024x768).
This is the cheapest formula you can get.
 