Lowly X2 3800 shines with the 8800GTS

halbhh

Distinguished
Mar 21, 2006
What's new here is that even a high-end video card like the 8800GTS will play nicely with the lowly X2 3800. Previously, I pointed out how you can and should spend twice as much on the video card as on the CPU for a gaming machine on a budget. Here you can spend about three times as much!

(Note this is an 8800GTS. For an 8800GTX you need more CPU horsepower.)

This works because of a critical fact: framerates above 60 fps are very difficult for the human eye to distinguish, and above 70 fps the difference is impossible to see. So 95 fps is no better than 70 fps.

When the most demanding current games are played at high resolutions like 1920x1200, they are limited by the video card, and the X2 3800 generally matches an X2 5200 or E6400 in framerates. (When under 60 fps, an improvement of only 2 fps at 1920x1200 from a better CPU just isn't enough to matter to the eye either.)

When these games are played at lower resolutions like 1280x1024, the framerates are generally above 60 fps with an X2 3800. So you are already there.

Let me repeat for clarity: the higher framerates you could get at 1280x1024 with a better CPU or with overclocking don't matter, because those framerates are already above 60 fps with the X2 3800.

[EDIT: Since this does require careful reading, let me simplify: keeping the 60 fps limit of perception in mind, the 3800 is basically as good as the faster CPUs in all the games except one, Lock On: Modern Air Combat. Only players of that game need more CPU behind their 8800GTS for the actual, real-life experience of play.]
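
If it helps to see the logic spelled out, here is a rough toy model of the argument (my own sketch, with made-up numbers, not data from the review): the framerate you actually get is set by whichever of the CPU or the video card is slower, and anything above roughly 60 fps is wasted on the eye.

# Toy model (illustration only): delivered fps is limited by whichever of the
# CPU or the GPU is slower, and the eye stops caring somewhere around 60 fps.

PERCEPTION_CAP = 60.0  # rough threshold argued above

def delivered_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    # The framerate you actually see: the slower component sets the pace.
    return min(cpu_fps_cap, gpu_fps_cap)

def perceived_fps(fps: float) -> float:
    # Anything above ~60 fps looks about the same to most people.
    return min(fps, PERCEPTION_CAP)

# Hypothetical caps, just to show the shape of the argument: at 1280x1024 even
# the X2 3800's ceiling is above 60 fps; at 1920x1200 the card itself is the limit.
scenarios = [
    ("X2 3800 @ 1280x1024", 75, 120),   # (cpu cap, gpu cap) -- made up
    ("X2 5200 @ 1280x1024", 95, 120),
    ("X2 3800 @ 1920x1200", 75, 45),
    ("X2 5200 @ 1920x1200", 95, 45),
]

for name, cpu_cap, gpu_cap in scenarios:
    fps = delivered_fps(cpu_cap, gpu_cap)
    print(f"{name}: {fps:.0f} fps delivered, ~{perceived_fps(fps):.0f} fps perceived")

With these made-up numbers, both CPUs land on the same perceived framerate at both resolutions, which is the whole point.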

Here's a great review of the 8800s with various CPUs in most of the best games:

http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/default.asp


That all means you can pair an 8800GTS with an X2 3800, get good results, and save your money for the next-generation CPU at the end of this year (or any time after).

This is a pleasant new idea: you can build a gaming machine rather cheaply with top performance for now, and save money for a drop-in upgrade to the next-generation CPU that is coming (drop-in compatible with AM2 boards).

Some people like to overclock, and for them the conclusion holds even more strongly. This is a great CPU choice for actual gaming results.

Some might have more money available but still want to be frugal. That describes me. I'd consider anything from an X2 4200 up to an X2 5200, since I like to use the Cool'n'Quiet feature (these powerful CPUs idle a lot, and why burn electricity overclocking them? It saves some energy and noise).
 

halbhh

Distinguished
Mar 21, 2006
Yeah, sounds like you're getting spoiled with nice framerates, and someday you'll have to have that next card. :) By then, perhaps ATI will make it nice price-wise. :)
 

bacoss

Distinguished
Mar 7, 2007
I have been trying to tell this to people forever: it's mostly about the video card. I can play any game you throw at my computer on AT LEAST medium graphics, and most I can usually max out.

And I have an AMD 3200+ @ 2.2GHz (mind you, this is a Newcastle core built on the 130nm process),
a 7600GT XXX from XFX (it has a bit of an overclock on it from XFX),
some cheapo one gig of RAM,
and all of this is in an eMachine.

I find it quite funny.
 

1Tanker

Splendid
Apr 28, 2006
Yeah, sounds like you're getting spoiled with nice framerates, and someday you'll have to have that next card. :) By then, perhaps ATI will make it nice price-wise. :)

Yeah, when I can see as high as 75 fps and no lower than 45 fps in WoW at max settings @ 1280x1024, you get kinda spoiled, lol. Not sure about my fps in FFXI, but I know I kill their little benchmark program easily, so I know I have no problem with it either. :lol:

This is something that hal usually omits in his repetitive posts on this subject: minimum FPS is more important. The X2 3800+ might deliver 75-90 fps (which, as he points out, is more than the eye can appreciate) but drop to the 40s in intensive scenes (which worsens the playing experience).
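
To put rough numbers on why the minimum matters, here's a tiny sketch with made-up frame times (my own illustration, not data from any review): the average can sit comfortably above 60 fps while short rough patches drop into the 40s, and it's those drops you feel.

# Made-up frame times (in milliseconds): mostly smooth ~75 fps frames,
# with a short rough patch of ~40 fps frames mixed in.
frame_times_ms = [13] * 90 + [25] * 10

fps_per_frame = [1000.0 / t for t in frame_times_ms]
average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
minimum_fps = min(fps_per_frame)

print(f"average: {average_fps:.0f} fps")   # about 70 fps -- looks fine on a chart
print(f"minimum: {minimum_fps:.0f} fps")   # 40 fps -- this is what you feel in-game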

The bottom line for those of you with slower AMD X2 CPUs is you’re going to want to overclock your processor a little in order to get the best performance out of GeForce 8800.

In fact a slower CPU does make a quantifiable difference.

In fact if you’ve got an X2 3800+ running at stock speeds, there’s no point in upgrading to a GeForce 8800 GTX, as there were often cases where the 3800+/8800 GTX combination were outrun by the 8800 GTS and a faster CPU like the 4200+ or 4600+.

As usual, halbhh's agenda shows through in his neglecting to present all the facts. What's new? :x
 

AeroB1033

Distinguished
Feb 13, 2007
No, but it does suggest that it might be better to spend $100 on a CPU and $300 on a video card than $300 on a CPU and $100 on a video card (for gaming).

And so far, there are no C2Ds that quite fall into that "budget" range... not even the E4300, which is still over $150.
 

halbhh

Distinguished
Mar 21, 2006
Just read the charts carefully, inspect the minimum framerate tables at the bottom of the pages, and get back to me if you still think so, saying precisely why, please.

And read my OP carefully, perhaps, if you didn't already.

I didn't write that article, and I'm reaching slightly different conclusions, based on looking carefully at that data.

You should notice that it matters for Lock On: Modern Air Combat, and not for the other games, at least to a reasonable person looking to get more for less. You should also notice my note about "2 fps" in the OP, and the other considerations, especially the 60 fps point.

The objective here is clearly not benchmark numbers, but actual playing feel and experience.
 

gallag

Distinguished
May 3, 2006
How many posts did you make stating that people should buy a P4 because the difference compared to a faster X2 didn't really matter?

THINK ON
 

halbhh

Distinguished
Mar 21, 2006
Ah, but... it throttles the card *above* 60 fps, and when below 60 fps it throttles the *same as* the faster CPUs, usually to within 2 fps or less, except for 1 game out of 11.

That's 10 current games out of 11 where it is basically as good as faster CPUs.

By the way, your reference to dropping down to "30 fps" (as in the Oblivion foliage area) makes it sound like the X2 3800 is responsible for that! As if a faster CPU with the 8800GTS would rise above the X2 3800 there.

It would not.

Perhaps you need to clarify or change what you are saying, so that it doesn't imply that.
...

This all means the X2 3800 is a great choice on a budget, when the person also considers that they can upgrade that CPU in 12 or 18 months, etc. This is why the 8800GTS cannot be "wasted" in any sense (unless someone fries it!).

As to building a top rig, like yours, that's a whole different subject, I think we'd agree.

By the way, I'm sure I'd rather have your top-flight system myself, say... for $120 more! (Honestly.) :)
 

Lionhardt

Distinguished
Jul 3, 2006
Good job, halbhh.

Overall, games will become multithreaded, but more than that, graphical complexity will increase.

I think that over time, as the GTS ages too, it will become less bottlenecked.
 

bullaRh

Distinguished
Oct 6, 2006
I think this OP was the guy who also said we should buy Raptors if our budget was more than $1000. :roll: I'd rather use my $200 on a better graphics card than on a Raptor.

By the way, I want to see the difference between a C2D and a 3800+ with a GTX; I think it's kinda big.
 
In theory, a high-end GPU should be paired with a high-end CPU, so as to reduce total system load and improve performance in the best possible way. But in a budget situation, unless the person is running an older non-EE/FX single-core chip, pairing a G80 with a 3800+ X2 or similar will not perceivably affect performance (in this case, FPS). Would I recommend doing it? Not really, but this is still good news for those on a budget and looking for a marked increase in gaming performance.
 

halbhh

Distinguished
Mar 21, 2006
You're easy enough to satisfy there, but what we find in the linked article is that the 3800 keeps up within about 3 fps of its faster brethren in all but 1 game out of 11, in *ALL* framerate situations below 60 fps.
 

bullaRh

Distinguished
Oct 6, 2006
In first-person shooters you want to keep your fps above 60, but in strategy games like "Age of Empires" and "Warcraft 3" you can go much lower and you won't notice it.

But mostly in first-person shooters you want to keep your fps above 60, though what people can notice differs from person to person.
 

halbhh

Distinguished
Mar 21, 2006
In theory, a high-end GPU should be paired with a high-end CPU, so as to reduce total system load and improve performance in the best possible way. But in a budget situation, unless the person is running an older non-EE/FX single-core chip, pairing a G80 with a 3800+ X2 or similar will not perceivably affect performance (in this case, FPS). Would I recommend doing it? Not really, but this is still good news for those on a budget and looking for a marked increase in gaming performance.

This is probably the most sensible compromise in the debate; good post. Also, I referenced your review of thermal compounds... I still think that is the best data on the web so far.

Jack

I can agree with that too. :)
 
Ah, thank you, to both of you. For some reason, I'm more interested in making compromises than in stating absolutes.

As for the thermal review, I'm going to be updating the info and running a couple more tests with a few more TIMs.
 

halbhh

Distinguished
Mar 21, 2006
OK, here it is:
Quake 4 running the HOCDEMO.DEMO file, downloadable from the HOC Benchmark website, using the X6800 with an 8800GTX, varying the resolution, with AA set to 16X. Command line used: timedemo HOCDEMO.DEMO 1

SMP=off (average fps)
640x480: 110.7
1024x768: 110.7
1280x1024: 110.8
1600x1200: 110.1
1920x1200: 108.9

SMP=on (average fps)
640x480: 143.2
1024x768: 144.3
1280x1024: 144.2
1600x1200: 137.0
1920x1200: 125.1

Here is my point: regardless of whether you can tell a difference between 100 and 140 fps, the average in a standard demo is not the whole story. And clearly an X6800 is capable of throttling this card. At standard resolutions, even if you cannot tell a difference, there is no point getting an 8800 GTS paired with a 3800+. Why? Because the 3800+ is limiting, so why not get a 7600GT, which will give you the same framerates?
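
To put that scaling in perspective, here's the same data reduced to percentage gains (nothing new, just arithmetic on the numbers above):

# Gain from SMP=off to SMP=on at each resolution, using the figures quoted above.
# The gain shrinks as the resolution climbs, i.e. the test becomes more GPU-bound.
results = {
    # resolution: (smp_off_fps, smp_on_fps)
    "640x480":   (110.7, 143.2),
    "1024x768":  (110.7, 144.3),
    "1280x1024": (110.8, 144.2),
    "1600x1200": (110.1, 137.0),
    "1920x1200": (108.9, 125.1),
}

for resolution, (off_fps, on_fps) in results.items():
    gain_percent = (on_fps / off_fps - 1) * 100
    print(f"{resolution}: +{gain_percent:.0f}% from SMP")

That works out to roughly +29-30% up through 1280x1024, +24% at 1600x1200, and about +15% at 1920x1200.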

Finally, the threshold for judging a game is not 30 fps but 60. Why? Because if you do not sync to 60 fps, anything below that will start to show tearing, as the refresh rate of the monitor will catch various frames mid-render. Thus, if you are shooting for above 60 fps, ensure that the CPU will not throttle the minimum frame rate below 60; this will ensure the smoothest gaming experience.

:) This debate will rage forever I suspect :)

Thanks for the data, but the reason not to get a 7600GT instead of the 8800GTS is exactly that you *won't* get the same framerates; you will get *better* framerates with the 8800GTS in just about all games, almost all of the time.

In the linked article, they also show 2 other cards, and you'll see what I mean.

Your X6800 *won't* get significantly better framerates than the X2 3800 in the Oblivion foliage!

If you haven't carefully looked over those charts and the min framerate tables, that's where you'll get what I'm saying.
 

bullaRh

Distinguished
Oct 6, 2006
Yeah, you will get higher framerates, but it's a little sad you can't get the last 35 fps out of your $500 GPU because your CPU is holding you back, aye?
 

halbhh

Distinguished
Mar 21, 2006
Your X6800 *won't* get significantly better framerates than the 30 fps the X2 3800 gets in the Oblivion foliage!

(reference is to the Oblivion minimum framerate with an 8800GTS in the foliage area, as in the table at the bottom of:
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/page12.asp )


If you haven't carefully looked over those charts and the min framerate tables, that's where you'll get what I'm saying.
 

RichPLS

Champion
Your X6800 *won't* get significantly better framerates than the X2 3800 in the Oblivion foliage!

If you haven't carefully looked over those charts and the min framerate tables, that's where you'll get what I'm saying.

Looking carefully at the aforesaid and incessantly insisted-upon charts/tables... :roll:
and using the X2 3800 and GTS data...
I see LockOn hitting 38 fps at 1600x1200,
Battlefield at 58 fps at 1600x1200,
Oblivion at 47 fps at 1600x1200,
Oblivion Performance worse, at 30 fps at 1600x1200,
and FarCry Performance at 53 fps...
...and I am to believe that using the GTS with my X6600 OC'd to 2x3.2GHz will not significantly improve those framerates over what an X2 3800 will achieve?!?