Can Your Old Athlon 64 Still Game?

I bought a Socket 939-based X2 3800+ in 2005, with 1 GB of RAM (in a single stick) and a GeForce 6600 (not GT). I upgraded the RAM to 2x1 GB right after that, and used to tweak the latencies quite a lot (I now leave it at 3-3-3-5, 1T).
It served me rather well for quite some time (I'm not a gamer), but when I got a 1680x1050 screen recently, the GeForce cried 'ugh'.

And then the Radeon HD 4850 came out.

My CPU is about 10% slower than the 4200+ shown in the benchmarks here; still, for the times I do play, it now pulls its weight quite handily.

I must admit I removed most background tasks from the OS, probably freeing up something like 15% of CPU time...
 

Kohlhagen

Distinguished
Mar 15, 2006
I still use the Asus A8R32-MVP Deluxe with an Opteron 165 (1.8GHz) overclocked to 2.7GHz and 2x1GB DDR400 at CL2-2-3-2. I have an extra A8R32-MVP and would love to send it to you guys if you're going to do a part 2 of this.

I've also got a 3870, so I've always been curious whether CrossFiring it on my aging system would show any performance gain.

 

KRayner

Distinguished
Oct 24, 2008
@ mottamort

Yeah, it does suck, although I'm not quite sure where you came up with the 'ATI cards are way more expensive than their Nvidia counterparts' theory. Looking at Frontosa's price list right now, an Asus 4870 costs R600 less than an Asus 260. I've always supported the best bang-for-the-buck solution. I've had Intel, AMD and even Cyrix CPUs, as well as SiS, 3dfx, Nvidia and AMD video cards. It's always about price vs. performance for me. I do love the fact that ATI forced Nvidia to bring their prices down BIG time just to compete; folks like us really appreciate it ;-)

mottamort, where in SA do you live?
 

KRayner

Distinguished
Oct 24, 2008
@ Kohlhagen

Have a look at the VGA charts to make a decision; however, in my experience multi-card setups are always a bad idea. The biggest problem is game compatibility vs. the performance gained for the cost. I've always found it better to replace the aging card for a few bucks more and get a guaranteed performance increase, rather than a negligible increase in some titles. The multi-card idea is a good one, but the reality is unfortunately not as good :-(
 

KRayner

Distinguished
Oct 24, 2008
@ Kohlhagen...again

Look here quick: http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/compare,798.html?prod%5B2116%5D=on&prod%5B2120%5D=on

It's a comparison of 3870s in CrossFire vs. a single 4850. Looking on Newegg, the average price difference between the two cards (one 3870 vs. one 4850) seems to be around $40. The performance is quite similar; however, bear in mind that in some titles the performance deficit for the CF setup is quite notable, which highlights my problem with this approach.
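
Purely as a back-of-the-envelope sketch of the cost-per-frame arithmetic being weighed here (Python, with made-up placeholder prices and frame rates rather than actual chart or Newegg figures):

[code]
# Toy cost-per-frame comparison: two HD 3870s in CrossFire vs. a single HD 4850.
# All numbers below are hypothetical placeholders; plug in real prices/benchmarks.
options = {
    "2x HD 3870 CrossFire": {"price": 2 * 110.0, "avg_fps": 55.0},
    "1x HD 4850":           {"price": 180.0,     "avg_fps": 52.0},
}

for name, o in options.items():
    cost_per_fps = o["price"] / o["avg_fps"]
    print(f"{name}: ${o['price']:.0f} for {o['avg_fps']:.0f} fps "
          f"-> ${cost_per_fps:.2f} per average fps")
[/code]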

Just my 2c ;-)
 

Mottamort

Distinguished
Oct 23, 2008
@ KRayner. Stellenbosch, and I don't have the luxury of online stores, seeing as the banks are unwilling to allow me credit ^_^. So when I say the prices are different, I'm referring to the actual crappy PC hardware SHOPS, which are a complete ripoff. But I'll have a look at Frontosa's price list. I've been looking at Rectron prices, although not recently.
 

crowheart27us

Distinguished
Oct 5, 2007
Well, I'm about to give this processor (Athlon X2 5600+, lol) to my brother, but after seeing this article I'm having second thoughts! Actually, after I show him this he should be happy with the budget rig I'm going to build for him. Gives me a reason to get my new CPU.
Currently getting 11,528 in 3DMark06 on my 8800 GT SLI system with this processor.
 

KRayner

Distinguished
Oct 24, 2008
@ mottamort

I stay in Kraaifontein, so not too far away geographically speaking. I find the Gigabyte GFX prices very high; I was quoting on a Leadtek 260, which is the exact same card as the Gigabyte but costs (before the price increases) R1000 less ex. VAT. Currently it's a bad time to buy any tech here in SA; I'll be waiting until next year before getting a new GFX card.
 

coulsond

Distinguished
Oct 24, 2008
Forgive my ignorance, but how is the AMD Athlon 64 4000+ single core directly comparable to an Athlon 64 X2 4800+ (both Socket 939)? I have the dual-core version; looking at tests such as Crysis, if mine is dual core, surely it would be better, as the game is utilising multithreading. I am curious to know, as I have 2 x 6800 GTs in SLI and was wondering whether to upgrade to a Radeon 4850. Cheers.
 

snarfies1

Distinguished
Dec 31, 2007
Up until a few months ago I was using a 939 Athlon 64 (Venice) 3000+. It worked fairly well for the games I played the most (Civ 4, SimCity 4). Gothic 3 ran fairly well so long as I used lower settings. I upgraded to a Q9450 as soon as it came out, and Gothic 3, using the same video card, ran even more smoothly with all of the settings cranked to the max. So yeah, it can make a difference.
 

computerninja7823

Distinguished
Sep 26, 2008
This is kinda not related, but I gotta throw my two cents in... I was able to play Half-Life 2 maxed out at 1152x864 with no lag on a GeForce 2 and a P4 at 1.7GHz! Old-school parts can kick some bootay! A buddy of mine was able to play Half-Life 2: Episode Two maxed at 1024x768 with a single-core Athlon at 2.4GHz or something, with an HD 2400 Pro!
 

malveaux

Distinguished
Aug 12, 2008
Heya,

Way too much stress is put on the whole "higher clock" requirement to make the stronger GPUs run games well. My AMD 5000+ plays all games, including Crysis and COD4, on my 42" 1080p HDTV with my overclocked 8800 GT 1GB at 1920x1080 with good frame rates. It's not 40+ "ALL THE TIME", but it's playable and nice. When I drop it down to 1680x1050 on my 22" LCD, it's completely smooth with no hiccups. And this is at High settings.

Single-core CPUs are long dead. Yes, they bottleneck the crap out of new GPUs.

But you certainly don't need some $150 CPU to run them. A $40 Brisbane will run 'em. And the 'marked' significance isn't accurately described here.

Cheers,
 

Worf101

Distinguished
Jun 25, 2004
Hmmm, confirms what I've suspected. I'm running an FX-60 with a 3870 and I can handle most new games on reduced settings. My primary game, "IL-2 Sturmovik: 1946", is single-threaded anyway, so I've always been able to run it well on XP Pro.

Next year, when the AM3 parts are out, I'll build a whole new rig and bequeath my current one to my son. Hopefully AMD will deliver da goods and I can buy "Crysis" and "Far Cry 2" and mash 'em up to the max.

Da Worfster
 

pauldh

Illustrious
To All - Thank you for the comments. Hopefully many readers will benefit from the article and the open discussion.

Thanks for mentioning the Opterons. They're great dual-core 939 chips if they can still be found, and knowing every option to search for helps if people are hunting for a 939 dual-core CPU, with the Opteron 185 (and the unlocked FX-60) being the top 939 processors out there.
 

ricstorms

Distinguished
Jul 24, 2007
Very nice article. Another thing to consider with a Socket 754 processor is that it uses single-channel memory, providing a further bottleneck to performance. It would have been nice to see an old FX processor give these games a try on the single-core side, but they are near impossible to get on eBay for less than $500 or so. I remember getting Oblivion to run decently with a 3800+ X2 and two 7600 GTs in SLI on my old Abit nForce4 board. It would have been funny to see an old Pentium D in the mix, just to give the AMD fanboys something to feel good about.
 

pauldh

Illustrious
[citation][nom]coulsond[/nom]forgive my ignorance but how is the amd athlon 64 4000+ single core directly comparable to an athlon 64 x2 4800+ (both 939 core)? I have teh dual core version, looking at the tests such as crysis, if mine is dual core, surely it would be better as the game is utilising multithreading. I am curious to know as I have 2 x 6800gt's in sli and was wondering whether to upgrade to a radeon 4850. Cheers.[/citation]
To clarify: if you want to compare single-core vs. dual-core on an equal playing field, we would compare your AMD Athlon 64 X2 4800+ to an AMD Athlon 64 4000+, as both have the same clock speed and amount of L2 cache (per core).

In older games, these two performed equally, whereas in newer games, yes, your X2 4800+ (being a dual-core) is a far more capable gaming CPU.
 
Guest
This article isn't bad, though it does put too much emphasis on clock speed for the dual-core processors. It shows that in order to CPU-bind a dual-core system (of pretty much any speed), you need a Radeon 4850 / 9800 GTX+. I assume no one is running 1024x768 anymore, which means it isn't even worth testing that resolution.
 

pauldh

Illustrious
[citation][nom]malveaux[/nom]Heya,Way too much stress on the whole "higher clock" to make the stronger GPU's run games well. And AMD5000+ plays all games, including Crysis & COD4, on my 42" 1080p HDTV on my 8800gt 1gig overclocked at 1920x1080 with good frame rates. It's not 40+ "ALL THE TIME" but it's playable and nice. When I drop it down to 1680x1080 on my 22" LCD, it's completely smooth and no hiccups. And this is at High settings.The single core CPU's are long dead. Yes. They bottleneck the crap out of new GPU's.But you certainly don't need some $150 CPU to run it. A $40 brisbane will run 'em. And the `marked' significance is not accurately described here on that.Cheers,[/citation]
Keep in mind, your X2 5000+ Brisbane is a 2.6GHz dual-core, hardly a slouch by any means even at its stock speed. It is far more capable than if it were running at 1.8-2.2GHz like the X2 3800+ through 4200+. The data points out that there is a large potential difference between lower and higher clock speeds in these dual cores. Just how much will vary depending on the game, the resolution, the FSAA level, and the GPU you pair it with. Thanks for the comment.
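
As a toy way to picture that trade-off, you can treat the frame rate as capped by whichever of the CPU or GPU is slower at a given resolution; the sketch below uses invented fps caps purely for illustration, not benchmark data:

[code]
# Toy CPU/GPU bottleneck model: effective fps ~= min(CPU cap, GPU cap at a resolution).
# All fps caps below are invented for illustration only.
cpu_caps = {"X2 3800+ (2.0GHz)": 45, "X2 5000+ (2.6GHz)": 65}   # fps the CPU can feed
gpu_caps = {1024: 120, 1280: 90, 1680: 60, 1920: 45}             # fps the GPU can render

for cpu, cpu_fps in cpu_caps.items():
    for width, gpu_fps in gpu_caps.items():
        effective = min(cpu_fps, gpu_fps)
        limiter = "CPU-bound" if cpu_fps < gpu_fps else "GPU-bound"
        print(f"{cpu} @ {width}-wide: ~{effective} fps ({limiter})")
[/code]

The faster dual core only shows its advantage where the GPU cap is above the slower CPU's cap, which is why the gap narrows at higher resolutions and FSAA levels.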
 

bourgeoisdude

Distinguished
Dec 15, 2005
[citation][nom]kobyhud[/nom]...I assume no one is running 1024x768 anymore which means it isn't even worth testing this resolution. [/citation]

I run 1024x768 on my 22" widescreen. I truly don't care that much about increasing the resolution vs. increasing the detail settings. In some games there is a bigger difference than in others, but I find that increasing the resolution doesn't help the detail much for me, except in some of the most recent games.
 

pauldh

Illustrious
[citation][nom]bourgeoisdude[/nom]I'm running the same mobo in the test using an old Athlon 64 X2 4400+ with the GeForce 8800GTS 320MB and I'm content.[/citation]
That's still a capable rig. With that mobo, you can disable one core in the BIOS and try your X2 4400+ as a single core. Easy, and kinda fun if you want to see just how much better off you are in a certain game. If this system is used again for testing, that may be the method used rather than swapping CPUs.
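
If a BIOS trip is a hassle, a rougher software-side approximation is to pin the game process to a single core from the OS and compare frame rates. The sketch below illustrates that alternative using the third-party psutil package and a placeholder game path; it is not the BIOS method described above, and scheduling/cache sharing make it less clean than truly disabling a core:

[code]
# Rough alternative to disabling a core in the BIOS: launch a game and
# restrict it to core 0 only, then compare frame rates against an
# unrestricted run. Requires the third-party 'psutil' package.
import subprocess
import psutil

GAME_EXE = r"C:\Games\SomeGame\game.exe"  # placeholder path, adjust for your game

proc = subprocess.Popen([GAME_EXE])
p = psutil.Process(proc.pid)
p.cpu_affinity([0])          # pin to the first core only ("single-core" run)
print("Running on cores:", p.cpu_affinity())
# For the dual-core run, use p.cpu_affinity([0, 1]) or just launch normally.
[/code]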
 

timaahhh

Distinguished
Nov 5, 2007
[citation][nom]neiroatopelcc[/nom]But your opteron cpu still limits the modern graphics cards (http://en.wikipedia.org/wiki/Video_card). Two years back I bought my 8800gtx, and realized it wouldn't come to its full potential in my opteron 170 (@ 2.7). A friend with another gtx paired with an e6400 chip (@ 3ghz) scored a full 30% higher in 3dmark than I, and it showed in games. Even in wow where you'd expect a casio calculator would deliver enough graphics power. In short - ye ddr still work if you've got enthusiast parts, but that can't negate the effect a faster cpu would give. At least at decent resolutions (22" wide)[/citation]
LoL at casio calculator.
 
My wife runs an Athlon 64 4400+ with an X1950 Pro 512MB and 2GB of DDR (PC3200). Though I personally wouldn't want to play Crysis on it, it runs Warhammer Online pretty decently (at least she never complains about it). There's definitely some use left in these older systems (WoW and WAR, for instance). But for the latest and greatest games, a system upgrade would help a lot.
 