Core 2 Duo E4300 or an X2 3600+?

Boys,

Let's not turn this into a huge argument. I just want HL2 to load quickly, like less than 10 seconds. On my current rig it takes about 30 seconds, which feels like forever.

Also, I'm into mapmaking, and I may start doing it again. I've been making maps since '01 (classic HL), but stopped because my old K6-2 was too weak, and since then I haven't done much mapping because it simply took too long.

On top of that, here's my summary:

1.) I have DDR2-667 RAM. If I want to save some more dough (a C2D is still a little too pricey, even the E4300), is the X2 3600+ a good choice?

2.) No overclocking at all, since this is the central PC at home. Period.

3.) Futureproofing - I want to play games that have dual-core support.

That's all. If a processor cuts the HL2 load time in half, I'm moving to it. If the E4300 loads in, say, only 5 seconds, I'll pick it for sure. If the X2 3600+ squeezes out a 10-second load, I'll choose that.

Opinions?
1.) The X2 3600+ is fast enough for any game, but the E4300 would be better for map editing (see the compile-timing sketch below).
2.) OCing is safe as long as you don't push the CPU to its limit. An OC like the one in my sig is safe and well below any heat problems, as most CPUs run fine up to 58°C. This won't work on a stock cooler, but even a cheap aftermarket cooler on a 65nm chip is good for some OCing.
3.) For futureproofing, look at how much RAM your mobo maxes out at and what GPU slot it has.
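On the map editing point: what a faster CPU mostly buys a mapper is shorter compile times (vbsp/vvis/vrad). A minimal Python sketch for timing the three standard compile stages, assuming a stock Source SDK install; the paths and map name below are placeholders, not values from this thread:

import os
import subprocess
import time

SDK_BIN = r"C:\Program Files\Steam\steamapps\youraccount\sourcesdk\bin"     # assumed install path
GAME_DIR = r"C:\Program Files\Steam\steamapps\youraccount\half-life 2\hl2"  # assumed game directory
MAP = r"C:\mymaps\testmap"  # map basename without extension (placeholder)

def timed(tool):
    # Run one compile stage and report wall-clock time.
    start = time.time()
    subprocess.run([os.path.join(SDK_BIN, tool), "-game", GAME_DIR, MAP], check=True)
    print("%s took %.1f s" % (tool, time.time() - start))

# Standard Hammer compile sequence; vvis and vrad are the CPU-heavy stages.
for stage in ("vbsp.exe", "vvis.exe", "vrad.exe"):
    timed(stage)

Running this on the same map before and after a CPU swap gives a much more honest picture of the mapping gain than game load times do.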
 

shadowmaster625

On top of that, here's my summary:

1.) I have DDR2-667 RAM. If I want to save some more dough (a C2D is still a little too pricey, even the E4300), is the X2 3600+ a good choice?

2.) No overclocking at all, since this is the central PC at home. Period.

3.) Futureproofing - I want to play games that have dual-core support. I know Socket 775 will still be around until next year or so, so if I buy the E4300 combo, later on when prices drop I can put an E6600 on top of it instead. But I know I can't do that on AM2, since AMD likes changing sockets, which can be very annoying. (No, I'm not an Intel or AMD fanboy - please take note.)

Actually, the AM2 is a much better upgrade candidate, since you only have to spend $60 now to get into it and a year from now you can upgrade for less combined cost.

You have to be careful when moving from a 3+ GHz NetBurst down to a 1.8GHz modern CPU. In many cases the new microarchitecture cannot totally make up for the loss of raw clock speed. If your 3.06GHz P4 has Hyper-Threading, I doubt you will see better than a 30% improvement in load times. If current load times are 30 seconds, you're probably looking at about 20 seconds at best with the 3600+, especially since your memory bandwidth won't be increasing any.
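Just to make that estimate concrete (the 30% figure above is the poster's rough guess, not a measurement), the arithmetic works out like this:

current_load_s = 30.0   # what the P4 takes today
assumed_gain = 0.30     # assumed best-case CPU-side improvement
estimated_s = current_load_s * (1 - assumed_gain)
print("Estimated new load time: %.0f s" % estimated_s)   # ~21 s, i.e. "about 20 sec"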
 
All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands-on experience playing HL2 and its spinoffs like CS:S? If you have, you'll realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.
I have a son who's heavy into CS:S and we both have copies, so yes, I have played a bit. Note page 6 of your review, where the gains slow above 1600MHz on the CPU. At the top CPU you only see a change of 20FPS, but with just the high-end cards the change was 37FPS. If the game were CPU bound on page 6, you would see almost no gains from the only slightly better GPUs. I see big differences in the GPUs, not the CPU. The only CPU bound game I know of is Oblivion. Crysis maybe, but it's not released yet.
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=6
 

The_YongGrand

Actually, the AM2 is a much better upgrade candidate, since you only have to spend $60 now to get into it and a year from now you can upgrade for less combined cost.

You have to be careful when moving from a 3+ GHz NetBurst down to a 1.8GHz modern CPU. In many cases the new microarchitecture cannot totally make up for the loss of raw clock speed. If your 3.06GHz P4 has Hyper-Threading, I doubt you will see better than a 30% improvement in load times. If current load times are 30 seconds, you're probably looking at about 20 seconds at best with the 3600+, especially since your memory bandwidth won't be increasing any.

Oh I see.

I don't know, but on the other hand the X2 3600+ looks a bit limited. If I use an E4300, is it a major leap ahead of the P4? I know the Core architecture is way better than any NetBurst chip available, so the lower clock speed won't matter with Core.

As I mentioned, my RAM stick isn't DDR2-800. Just DDR2-667.

The X2 is just two Athlon 64 3000+ cores fused together, and one core of that pair would be about as fast as my P4, or slightly slower. Is that true?
 

epsilon84

Actually, the AM2 is a much better upgrade candidate, since you only have to spend $60 now to get into it and a year from now you can upgrade for less combined cost.

You have to be careful when moving from a 3+ GHz NetBurst down to a 1.8GHz modern CPU. In many cases the new microarchitecture cannot totally make up for the loss of raw clock speed. If your 3.06GHz P4 has Hyper-Threading, I doubt you will see better than a 30% improvement in load times. If current load times are 30 seconds, you're probably looking at about 20 seconds at best with the 3600+, especially since your memory bandwidth won't be increasing any.

Oh I see.

I don't know, but on the other hand the X2 3600+ looks a bit limited. If I use an E4300, is it a major leap ahead of the P4? I know the Core architecture is way better than any NetBurst chip available, so the lower clock speed won't matter with Core.

As I mentioned, my RAM stick isn't DDR2-800. Just DDR2-667.

The X2 is just two Athlon 64 3000+ cores fused together, and one core of that pair would be about as fast as my P4, or slightly slower. Is that true?

If you plan on keeping your current RAM then I suggest the E4300, since C2D does not suffer nearly as much from lower spec RAM as AM2 does, plus it'll allow you to overclock to 3GHz while keeping the RAM in spec.

As many people have said already, the E4300 is faster, and overclocks better, but it comes at a cost. You'll most likely see the biggest difference in mapmaking, since I believe both CPUs are more than fast enough to run the game at 60fps+ once overclocked.

Another poster said to be wary about moving from a ~3GHz Netburst to a ~2GHz X2/C2D. This is a very good point. If you don't plan on overclocking then the X2 3600+ (and to a lesser extent the E4300) won't be a hell of a lot quicker than your existing P4. However once you overclock the chips you should see a much bigger performance gulf.
 

epsilon84

All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands-on experience playing HL2 and its spinoffs like CS:S? If you have, you'll realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.
I have a son who's heavy into CS:S and we both have copies, so yes, I have played a bit. Note page 6 of your review, where the gains slow above 1600MHz on the CPU. At the top CPU you only see a change of 20FPS, but with just the high-end cards the change was 37FPS. If the game were CPU bound on page 6, you would see almost no gains from the only slightly better GPUs. I see big differences in the GPUs, not the CPU. The only CPU bound game I know of is Oblivion. Crysis maybe, but it's not released yet.
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=6

You have to realise that the page you are linking to is showing significant gains in CPU scaling, especially with the top end GPU of the time (X850XT).

My point all along is that with GPUs being 2 - 4 times as powerful as the X850XT, you will see even more CPU scaling in HL2 now than 2 years ago.

But as I've said many times already, it's quite moot if an overclocked E4300 gets 200fps while the overclocked X2 3600+ gets 150fps.
 

The_YongGrand

Actually, the AM2 is a much better upgrade candidate, since you only have to spend $60 now to get into it and a year from now you can upgrade for less combined cost.

You have to be careful when moving from a 3+ GHz NetBurst down to a 1.8GHz modern CPU. In many cases the new microarchitecture cannot totally make up for the loss of raw clock speed. If your 3.06GHz P4 has Hyper-Threading, I doubt you will see better than a 30% improvement in load times. If current load times are 30 seconds, you're probably looking at about 20 seconds at best with the 3600+, especially since your memory bandwidth won't be increasing any.

Oh I see.

I don't know, but on the other hand the X2 3600+ looks a bit limited. If I use an E4300, is it a major leap ahead of the P4? I know the Core architecture is way better than any NetBurst chip available, so the lower clock speed won't matter with Core.

As I mentioned, my RAM stick isn't DDR2-800. Just DDR2-667.

The X2 is just two Athlon 64 3000+ cores fused together, and one core of that pair would be about as fast as my P4, or slightly slower. Is that true?

If you plan on keeping your current RAM then I suggest the E4300, since C2D does not suffer nearly as much from lower spec RAM as AM2 does, plus it'll allow you to overclock to 3GHz while keeping the RAM in spec.

As many people have said already, the E4300 is faster, and overclocks better, but it comes at a cost. You'll most likely see the biggest difference in mapmaking, since I believe both CPUs are more than fast enough to run the game at 60fps+ once overclocked.

Another poster said to be wary about moving from a ~3GHz Netburst to a ~2GHz X2/C2D. This is a very good point. If you don't plan on overclocking then the X2 3600+ (and to a lesser extent the E4300) won't be a hell of a lot quicker than your existing P4. However once you overclock the chips you should see a much bigger performance gulf.

Thanks,

I'll pick the E4300. I know dropping from a NetBurst to a Core is a bit of a gamble, but I'm futureproofing for more games; who knows, they may be multithreaded. No overclocking for me, so I'll just leave the chip alone. And you said the Core chips do fine on DDR2-667, whereas AMD only reaches max performance on DDR2-800. What do you think?
 

kitchenshark

If you want to see faster map load times, consider a Raptor hard drive or RAID 0. The biggest leap in load times I experienced came from my Raptor and from OC'ing my E4300 to 2.97GHz... In HL2 it loads the initial map in about 30-45 seconds, and an in-map load usually takes 15-20 seconds. Maybe a little less; I've never sat down and timed it precisely.
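Since the drive is being singled out here, a quick way to sanity-check it is to time a sequential read of one of HL2's large content files and compute throughput. A minimal Python sketch; the file path is a placeholder, and ideally the file should be bigger than your RAM (or read right after a reboot) so the OS cache doesn't skew the number:

import time

PATH = r"D:\Steam\steamapps\half-life 2 content.gcf"  # placeholder path to a large file
CHUNK = 4 * 1024 * 1024                               # read in 4 MiB blocks

start = time.time()
total = 0
with open(PATH, "rb") as f:
    while True:
        block = f.read(CHUNK)
        if not block:
            break
        total += len(block)
elapsed = time.time() - start
print("%.0f MiB in %.1f s (%.0f MiB/s)" % (total / 2**20, elapsed, total / 2**20 / elapsed))

Comparing that figure between an ordinary drive and a Raptor or RAID 0 array shows how much of the load time is disk-bound rather than CPU-bound.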
 

epsilon84

Thanks,

I'll pick the E4300. I know dropping from a NetBurst to a Core is a bit of a gamble, but I'm futureproofing for more games; who knows, they may be multithreaded. No overclocking for me, so I'll just leave the chip alone. And you said the Core chips do fine on DDR2-667, whereas AMD only reaches max performance on DDR2-800. What do you think?

Well, in your circumstances I would say it's a good choice. Being able to keep your existing RAM will cut down on costs, and you'll see a more noticeable improvement over the P4 with the E4300.

You may have to upgrade your mobo, though; if it's a 945 chipset it *may* support C2D with a BIOS update - some boards do, some don't.
 

epsilon84

Well, the 3600+ is obviously the best buy for the money - $59, $69 retail now at Newegg I think - and everyone is overclocking them to 3GHz.

'Everyone'? How many people are running an X2 3600+ @ 3GHz on these forums? The highest I've seen is 2.8GHz. Sure, *some* people are getting 3GHz on other forums, but that is the upper limit more than the norm.

An E4300 is ~$115 retail and should overclock to 3GHz easily, 3.5GHz if you have good cooling. At such speeds it would blow away any overclocked X2. You'd need an X2 @ 4.2GHz to compete... which ain't gonna happen short of LN2.

Overall platform cost would be about $50-$100 higher for the E4300 depending on the mobo, but don't act as if the X2 3600+ has an edge in overclocking, because it will get utterly spanked by the E4300.

As an owner of an E6300 @ 3.2GHz, you should probably know that already. If the X2 3600+ is such a great overclocker, why don't you just replace your current system with one then? :lol: :wink:
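A back-of-the-envelope check on that 4.2GHz figure: it simply assumes the Core 2 does roughly 20% more work per clock than the K8, so an X2 would need about 1.2x the E4300's clock to keep up. The 1.2x ratio here is an assumption for illustration, not a benchmark result:

e4300_oc_ghz = 3.5          # claimed achievable E4300 overclock
assumed_ipc_ratio = 1.2     # assumed Core 2 work-per-clock advantage over K8
equivalent_x2_ghz = e4300_oc_ghz * assumed_ipc_ratio
print("X2 clock needed to match: %.1f GHz" % equivalent_x2_ghz)   # 4.2 GHz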

By everyone I mean the majority. Check the forums.
If the chip had been out at the time I upgraded my AM2 system, I surely would have bought it.

What forums? I frequent these forums the most, and I've yet to see someone run an X2 3600+ @ 3GHz. Look at Elbert in this thread: he is running his at 2.36GHz, and while I'm aware that's a conservative overclock, please don't try to insinuate that the majority of people are running X2 3600+s at 3GHz, because that is clearly not the case.

You don't see people claiming E4300s do 100% overclocks to 3.6GHz all the time, because that is more of an UPPER LIMIT than the NORM. Same goes for 3GHz on the X2 3600+.
 
All I am saying is that HL2 scales more with CPU clockspeed than most games, which are generally more GPU bound.

Do you have hands-on experience playing HL2 and its spinoffs like CS:S? If you have, you'll realise how CPU bound it is, especially in multiplayer, where the physics model had to be TONED DOWN by Valve because it was too taxing on CPUs.
I have a son who's heavy into CS:S and we both have copies, so yes, I have played a bit. Note page 6 of your review, where the gains slow above 1600MHz on the CPU. At the top CPU you only see a change of 20FPS, but with just the high-end cards the change was 37FPS. If the game were CPU bound on page 6, you would see almost no gains from the only slightly better GPUs. I see big differences in the GPUs, not the CPU. The only CPU bound game I know of is Oblivion. Crysis maybe, but it's not released yet.
http://anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=6

You have to realise that the page you are linking to is showing significant gains in CPU scaling, especially with the top end GPU of the time (X850XT).

My point all along is that with GPUs being 2 - 4 times as powerful as the X850XT, you will see even more CPU scaling in HL2 now than 2 years ago.

But as I've said many times already, it's quite moot if an overclocked E4300 gets 200fps while the overclocked X2 3600+ gets 150fps.
I realize you are trying to prove a point with outdated information. You quoted this review in poor taste to begin with. And as stated, it's quite moot: an X2 3600+ with an 8800GTS can get higher fps than an E4300 with an 8600GTS for about the same price. The page I linked shows the improvements falling off above a 1600MHz CPU, and it shows a large difference even among the top GPUs. Any CPU sold today can handle this game, as it's getting quite old and is in no way bound by any current CPU. For a game to be CPU bound, changing GPUs would have to show little or no difference. Even this old test shows that not to be the case.

Now, if you want to make the game CPU bound you can get fps numbers like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.
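The rule of thumb being used above - if frame rate barely moves when you drop the resolution, the CPU is the limiter; if it jumps, the GPU was - can be written down as a tiny helper. A sketch with made-up sample numbers, not benchmark data:

def looks_cpu_bound(fps_low_res, fps_high_res, threshold=0.10):
    # CPU bound if dropping the resolution gains less than ~10% FPS.
    return (fps_low_res - fps_high_res) / fps_high_res < threshold

print(looks_cpu_bound(92.0, 88.0))    # True  -> FPS barely moved, the CPU is the limiter
print(looks_cpu_bound(140.0, 85.0))   # False -> big jump at low res, so GPU bound at high res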
 

zenmaster

I wouldn't say I'm agreeing.

I think the $75 for upgrading from an X2 to C2D is more than worth it.
If you have very minimal funds, an X2 could suffice.

I would suggest the poster spend the afternoon washing a few cars or perhaps raking leaves for a neighbor and get the funds to upgrade to the C2D.
 

epsilon84

No game except perhaps FS-X can really be described as totally CPU bound. Even Oblivion is more GPU bound than CPU bound, but it still shows CPU scaling in certain situations, as I'm sure you're well aware.

To be fair, the HL2 engine has been upgraded to support HDR so it does tax GPUs more than it did when the article was done, but I still think it will exhibit more CPU scaling now than it did in 2005.

I guess I should clarify that by 'CPU bound' I don't mean the game is totally CPU limited; I am just saying that HL2 shows greater scaling from increased CPU speeds than many other games/game engines, new or old.

Here are some NEW HL2: Lost Coast benchmarks showing CPU scaling with the latest 8800 GPUs and X2/C2D CPUs:
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/page5.asp
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_core_2_performance/page8.asp
 
I wouldn't say I'm agreeing.

I think the $75 for upgrading from an X2 to C2D is more than worth it.
If you have very minimal funds, an X2 could suffice.

I would suggest the poster spend the afternoon washing a few cars or perhaps raking leaves for a neighbor and get the funds to upgrade to the C2D.
I think the $75 would be better spent on a better GPU, i.e. an 8800GTS instead of an 8600GTS. Game-wise the GPU is the better choice, and I wouldn't spend an extra $75 on a CPU for a 2-3 FPS increase at 1600x1200. Map editing is the only thing that needs the E4300, if the OP does enough of it to warrant the $75.
 

epsilon84

Now, if you want to make the game CPU bound you can get fps numbers like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.

I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/default.asp

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.

Of course, if I had to choose between an X2 3600+/8800GTS or an E4300/8600GTS for gaming, I would surely choose the X2 3600+/8800GTS config. But an E4300/8800GTS would be a more balanced system, budget permitting of course. :wink:
 

zenmaster

And where does the poster say he can't buy both?

Have you never gone to a restaurant and ordered both spaghetti and meatballs? You don't have to limit yourself to just spaghetti or just meatballs.

Go live life! Have both!

I don't know about you, but a $75 difference in system price is not major.
The system performance is.

If he is a poor college kid working 3 jobs to make ends meet and pay for college on his own, maybe $75 is a big deal. If he is like most folks, it's not. It's simply a matter of watching a movie and having dinner at home one Saturday evening instead of having date night out.
 

epsilon84

And where does the poster say he can't buy both?

Have you never gone to a restaurant and ordered both spaghetti and meatballs? You don't have to limit yourself to just spaghetti or just meatballs.

Go live life! Have both!

I don't know about you, but a $75 difference in system price is not major.
The system performance is.

If he is a poor college kid working 3 jobs to make ends meet and pay for college on his own, maybe $75 is a big deal. If he is like most folks, it's not. It's simply a matter of watching a movie and having dinner at home one Saturday evening instead of having date night out.

I know what you are saying. I'm building a new rig and was tempted by the X2 3600+ due to the low price. But in the end I opted for the E4300 because it only represented a ~$50 difference in system price, and since I'm spending around $1000 anyway, that $50 really wasn't too big a deal for ~30% greater CPU performance after overclocking (quick math below).

I may still end up with an X2 3600+ as a secondary/lounge room PC; they are just too cheap to pass up! 8)
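Quick math on the trade-off epsilon84 describes above; the figures are his own rough estimates, not measurements:

system_cost = 1000.0        # rough total build cost
extra_for_e4300 = 50.0      # price delta over the X2 3600+ platform
cpu_perf_gain = 0.30        # claimed CPU performance gain after overclocking

print("%.0f%% more money for %.0f%% more CPU performance"
      % (100 * extra_for_e4300 / system_cost, 100 * cpu_perf_gain))   # 5% for 30%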
 
Now, if you want to make the game CPU bound you can get fps numbers like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.

I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/default.asp

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.
Nice try, but if you benchmarked the 8800GTS against the 8600GTS you would see more than a 2-3 fps difference with an X2 3600+. Your link, even with old GPUs, doesn't show bottlenecks at 1280x1024, as the lowest card on the list isn't that far off the front runner. If you look at the benchmark, bound CPUs would show little if any change in FPS. Move it down to 800x600 and that test would ring true, but not at 1280x1024. So 1280x1024 being the most popular resolution amongst gamers doesn't help your review link.
 

epsilon84

Now, if you want to make the game CPU bound you can get fps numbers like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.

I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/default.asp

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.
Nice try, but if you benchmarked the 8800GTS against the 8600GTS you would see more than a 2-3 fps difference with an X2 3600+. Your link, even with old GPUs, doesn't show bottlenecks at 1280x1024, as the lowest card on the list isn't that far off the front runner. If you look at the benchmark, bound CPUs would show little if any change in FPS. Move it down to 800x600 and that test would ring true, but not at 1280x1024. So 1280x1024 being the most popular resolution amongst gamers doesn't help your review link.

I'm not even sure what we're arguing about anymore. Are you disputing HL2 shows greater CPU scaling than many other games?

I think any enthusiast would know an X2 3600+/8800GTS is better than an E4300/8600GTS for gaming. They would also be aware that an E4300/8800GTS would be a more balanced system in terms of CPU vs GPU power.

Reality check: a stock X2 3600+ is slower than a 3-year-old A64 3200+ in single-threaded gaming. It will bottleneck an 8800GTS in many cases at 1280x1024. I (and many others) happen to own a 1280x1024 LCD, so while you may not find it relevant to yourself, it is relevant to a LOT of people.

I'm sorry, what are we discussing now?
 
And where does the poster say he can't buy both?

Have you never gone to a restaurant and ordered both spaghetti and meatballs? You don't have to limit yourself to just spaghetti or just meatballs.

Go live life! Have both!

I don't know about you, but a $75 difference in system price is not major.
The system performance is.

If he is a poor college kid working 3 jobs to make ends meet and pay for college on his own, maybe $75 is a big deal. If he is like most folks, it's not. It's simply a matter of watching a movie and having dinner at home one Saturday evening instead of having date night out.
LOL, that's about the price difference between an 8600GTS and an 8800GTS 320MB. Yes, the system performance gain is major for a GPU change in games, compared to a CPU change.
 
Now, if you want to make the game CPU bound you can get fps numbers like you state, but at higher resolutions the same GPU will show little change in FPS. But as I stated, for the price you can get an 8800 with the X2 3600+ for what an E4300 and an 8600 cost.

I know we are getting off topic (and I apologise to the OP but I believe I have responded to his initial questions already) but I honestly don't think an X2 3600+ (at stock speeds) is a good match for an 8800 class GPU. It would be bottlenecking it in many cases, as shown in this article: http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/default.asp

I know at 1600x1200 CPU bottlenecking diminishes, but 1280x1024 is still the most popular resolution amongst gamers, since most 17"/19" LCDs use this resolution.
Nice try, but if you benchmarked the 8800GTS against the 8600GTS you would see more than a 2-3 fps difference with an X2 3600+. Your link, even with old GPUs, doesn't show bottlenecks at 1280x1024, as the lowest card on the list isn't that far off the front runner. If you look at the benchmark, bound CPUs would show little if any change in FPS. Move it down to 800x600 and that test would ring true, but not at 1280x1024. So 1280x1024 being the most popular resolution amongst gamers doesn't help your review link.

I'm not even sure what we're arguing about anymore. Are you disputing HL2 shows greater CPU scaling than many other games?

I think any enthusiast would know an X2 3600+/8800GTS is better than an E4300/8600GTS for gaming. They would also be aware that an E4300/8800GTS would be a more balanced system in terms of CPU vs GPU power.

Reality check: a stock X2 3600+ is slower than a 3-year-old A64 3200+ in single-threaded gaming. It will bottleneck an 8800GTS in many cases at 1280x1024. I (and many others) happen to own a 1280x1024 LCD, so while you may not find it relevant to yourself, it is relevant to a LOT of people.

I'm sorry, what are we discussing now?
The CPU bound properties of HL2, and system price differences. Come on, epsilon84, try and keep up. If you need me to agree with you, find an Oblivion benchmark with an X2 3600+ and I'll agree it's bottlenecked. Now, the reality check is that even a 3-year-old A64 didn't show up as CPU bound. I use a 1280x1024 CRT and see no CPU bottlenecks unless I play Oblivion.
 

epsilon84

LOL, that's about the price difference between an 8600GTS and an 8800GTS 320MB. Yes, the system performance gain is major for a GPU change in games, compared to a CPU change.

I'm not quite sure why you are so obsessed with the 8800GTS vs 8600GTS comparison.

You seem hell-bent on comparing an X2 3600+ system vs an E4300 system at the same price point. Why is that? I think it's clear to everyone that, for a config comparable to an X2 platform, the C2D will cost $50-$100 more. With that comes greater CPU performance. It's really quite simple; I'm quite amazed we are still discussing this topic.
 
LOL, that's about the price difference between an 8600GTS and an 8800GTS 320MB. Yes, the system performance gain is major for a GPU change in games, compared to a CPU change.

I'm not quite sure why you are so obsessed with the 8800GTS vs 8600GTS comparison.

You seem hell-bent on comparing an X2 3600+ system vs an E4300 system at the same price point. Why is that? I think it's clear to everyone that, for a config comparable to an X2 platform, the C2D will cost $50-$100 more. With that comes greater CPU performance. It's really quite simple; I'm quite amazed we are still discussing this topic.
Simple: it's about the same price difference as between the X2 3600+ and the E4300. God, epsilon84, you fall behind fast. Now, is that $100 worth 2-3 FPS? I say no, and the OP should base the CPU choice only on map making.
 

epsilon84

Come on, epsilon84, try and keep up. If you need me to agree with you, find an Oblivion benchmark with an X2 3600+ and I'll agree it's bottlenecked. Now, the reality check is that even a 3-year-old A64 didn't show up as CPU bound. I use a 1280x1024 CRT and see no CPU bottlenecks unless I play Oblivion.

It's funny you say you are CPU bound in Oblivion, since it is far more GPU bound than HL2 ever was (or is). Are you sure you're not exceeding the 320MB memory with excessive AA?

Oblivion CPU scaling:
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/page11.asp
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/page12.asp

I don't see much CPU bottlenecking on an 8800GTS, I'm afraid. It's only evident on the 8800GTX.