Gamers: Do You Need More Than An Athlon II X3?


zodiacfml

Distinguished
Oct 2, 2008
Unbelievable.
The point here should still rest on the 60 FPS cap of ordinary displays. The FPS difference between the systems is only around 10 FPS, including minimum frame rates, and that does not justify the cost of a higher-end machine.
I am tired of CPU reviews of games where a 100+ FPS result is declared a win over the 55+ FPS of a lower-end CPU.

If someone games on a true 120 Hz display, then I'd recommend the overclocked i7-920 in CrossFire. I believe we can distinguish frame rates up to about 85 Hz/FPS.
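
For what it's worth, here is a rough back-of-the-envelope sketch of that argument in plain Python. The FPS and refresh numbers are only the examples quoted in this thread, not benchmark data:

# Rough frame-time arithmetic behind the refresh-cap argument above.
# The FPS values are just the figures mentioned in this thread, not measurements.

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

def displayed_fps(rendered_fps, refresh_hz):
    """With vsync on, the display never shows more frames than it refreshes."""
    return min(rendered_fps, refresh_hz)

for low, high in [(55, 65), (100, 110)]:
    gap = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} vs {high} FPS -> frame-time gap of {gap:.2f} ms per frame")

for refresh in (60, 120):
    for rendered in (55, 65, 100):
        print(f"{rendered} FPS rendered on a {refresh} Hz panel "
              f"-> {displayed_fps(rendered, refresh)} FPS actually shown")

The point it illustrates: a roughly 10 FPS gap near 60 FPS is only a couple of milliseconds per frame, and on a 60 Hz panel anything above 60 FPS is discarded anyway.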
 

doron

Distinguished
Feb 15, 2009
"At 2560x1600, however, the graphics subsystem dominates the processor. At this resolution, the Athlon II X3 440 system has a clean win. To whom does this apply? The limited few with 30" displays."

Guys who buy two 5870s will probably have the money for a hefty Eyefinity setup, which can easily run at more than 2560x1600. I don't see the logic in using even a 1920x1200 monitor with such a powerful and expensive GPU configuration.
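
As a rough illustration of how quickly Eyefinity outruns a single 30" panel, here is some simple pixel-count arithmetic; the multi-monitor layouts below are hypothetical examples, not configurations from the article:

# Pixel-count comparison; the Eyefinity layouts are hypothetical examples.

resolutions = {
    "single 1920x1200":        (1, 1920, 1200),
    "single 2560x1600 (30in)": (1, 2560, 1600),
    "3x1 Eyefinity 1920x1200": (3, 1920, 1200),
    "3x2 Eyefinity 1920x1200": (6, 1920, 1200),
}

baseline = 2560 * 1600
for name, (panels, w, h) in resolutions.items():
    pixels = panels * w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / baseline:.2f}x a single 30-inch panel)")

Even a three-wide 1920x1200 array pushes roughly 70% more pixels than a single 30" display, which is where money like that tends to end up going.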
 
Guest
Well, the real reason to get an i7-920 is to avoid having to buy another CPU in a year or two. The Athlon X3 is a good processor, but would you bet it will still be a good gaming processor next year? I certainly wouldn't. What about in three years? I'd bet the i7 will keep up in three years, whereas people who buy an Athlon X3 today will be spending the money they saved within a year or two, an annoying situation where you have to discard your processor after just two years.
 

anamaniac

Distinguished
Jan 7, 2009
[citation][nom]Jarmo[/nom]I'd guess at least 90% of users never overclock anything. To be fair though, probably 90% of Tom's readers do.[/citation]
What about those of us who did overclock and don't anymore? :)
[citation][nom]cashews[/nom]It would be great to see Battlefield Bad Company 2 tested in these reviews as well.[/citation]
BF:BC2 doesn't take much to play, don't worry. Hell, I play it at 6144x1152 with a mix of medium/high settings on a single 5770 and a stock-clocked i7-920. An Athlon II X3 would be more than enough.
Springing for a high-end quad core may be more justifiable if you like to play modern RTS games, though, but not for FPS games.
 
Guest
Obviously, for a CrossFire 5870 setup you wouldn't choose a $100 CPU, but a 5850 is a reasonable pairing. Why run those tests without AA or AF, though? Who games without them these days? With them turned on, I think the results would be even closer between the Intel and the AMD + 5850...
 
Hey, let's get a Daewoo Lanos and compare it to a Porsche 911 ... take them both around the racecourse for a spin.

The Porsche whips the Daewoo in pretty much every category with the exception of LUGGAGE SPACE and possibly ROAD NOISE.

On that basis, I recommend that you buy the Daewoo if you want to race casually ... but if you're serious ... just get the Porsche.

Don ... where were you going with this article?
 
Very interesting. I wonder whether the added core of an Athlon II X4 would show any measurable improvement, or whether it is simply Intel's current clock-for-clock superiority over AMD that makes the biggest difference.
 

kelemvor4

Distinguished
Oct 3, 2006
If you've only got an ATI 5870 or similar, you might get away with an Athlon II X3, but with a Fermi-based card you've got a significant CPU bottleneck until you get into the 3.5 GHz i7 range. Can you tell? Absolutely, assuming you run a decent resolution (not 1280 or some other small one), because once you remove the CPU bottleneck, a GTX 480 can hit 40 FPS or more with all settings maxed in pretty much any current game.

Sure, an Athlon II X3 will be good enough to game at lower resolutions or with less-than-max settings, so technically you don't "NEED" more. It's a matter of want versus need, I suppose.
 

Otus

Distinguished
Nov 23, 2009
The graphics settings for the dual 5870s aren't very realistic. No AA? No AF? Turning vsync off for benchmarking a single component makes sense, but in this case it should have been on to appropriately cap the useless extra frames.
 

2shea

Distinguished
Oct 11, 2008
The irony of this story is epic: only people with screens so big that they already have a monster PC with the most powerful overclocked CPU in it would "benefit" from an X3 rather than the 920 :')
For the rest of us with ordinary 1080p screens, we also don't need that much CPU power for gaming; the Intel does give us a bit more, but it's not that big of a difference.
The main advantage a 3-, 4-, or 6-core CPU has over a dual core is better multitasking and better performance in multicore-oriented programs like WinRAR and video, photo, and sound editing...
 
Guest
Most people do not overclock, but if you are going to run a CrossFire setup with a low-end CPU, you are going to overclock!

I would love to have seen how both would have performed with an easily attainable overclock on air, at least in the CrossFire setups.
 

fozzie76

Distinguished
Mar 30, 2009
Great article, but who buys a 440 and runs it at stock speeds? These babies beg to be overclocked! 3.8 GHz on air is a piece of cake with the $29 N90 cooler. Would a 440 running at 3.8 or 4.0 GHz take out a stock 920? Do a part two and tell us!
 

banthracis

Distinguished
You guys really should have done this like the Legion Hardware article on the i7-920 vs. Phenom II X4 with a 5970.
http://www.legionhardware.com/articles_pages/cpu_scaling_with_the_radeon_hd_5970,1.html

They demonstrated that with overclocks and underclocks from 2.0 to 4.0 GHz, the Intel processors benefit very little, while the Phenom II, despite starting behind, quickly catches up to the Intel processors.

If the Athlon II X3 can demonstrate the same scaling, that will throw a huge curveball into your conclusion.
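
The intuition behind that kind of scaling test can be sketched with a toy model: the delivered frame rate is roughly the lower of what the CPU and the GPU can each sustain, so a slower CPU "catches up" once its ceiling climbs past the GPU's. The per-GHz numbers below are invented purely for illustration, not data from the Legion Hardware article:

# Toy bottleneck model: delivered FPS is roughly min(CPU ceiling, GPU ceiling).
# The per-GHz throughput figures are invented for illustration only.

GPU_CEILING_FPS = 90  # what the graphics card could deliver if never CPU-limited
CPU_FPS_PER_GHZ = {"hypothetical Intel": 32, "hypothetical AMD": 25}

for clock_ghz in (2.0, 2.5, 3.0, 3.5, 4.0):
    parts = [f"{clock_ghz:.1f} GHz:"]
    for cpu, per_ghz in CPU_FPS_PER_GHZ.items():
        delivered = min(per_ghz * clock_ghz, GPU_CEILING_FPS)
        parts.append(f"{cpu} -> {delivered:.0f} FPS")
    print("  ".join(parts))

Once both CPUs clear the GPU's ceiling, the gap between them disappears, which is exactly the kind of curveball described above.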

Also, seriously, testing "real world" scenarios without AA/AF? /fail
 

cutterjohn

Distinguished
Apr 16, 2009
I spent around $300 on the CPU in my last desktop build, and I'm quite willing to spend about that again, especially considering that I use my computers for more than gaming and Core represents the best value at the moment.

Also note that I do NOT overclock my hardware unless it comes that way from the manufacturer. I prefer longevity to a slight increase in performance, at least as far as I've observed with overclocking and real-world applications, and especially with gaming, where most of the gain comes from overclocking the GPU. I'll admit there are cases where the extra clock cycles would be nice for non-gaming applications, but overall I prefer my hardware to last as long as possible, since I usually build something meant to last at least three years. PSUs (which are staying about the same) and hard drives (which are getting better) tend to be sore spots for me, since I go through them more frequently, and with PSUs in particular I wouldn't want to add the drastic power-draw increases that overclocking usually brings.

Anyway, as to spending $300: if Micro Center keeps up its deals through Sandy Bridge, I'm only looking at a $200 CPU, which is $100 less than my last one (a 4800+ X2), so to me it's a bargain. Even at $300, Core seems more than worth it; if AMD had any hope of actually competing on core efficiency, you can believe their CPUs would again be priced just like Intel's, as they were back when I bought the 4800+. Right now AMD has nothing to work with other than price and, ironically, clock rate, but I guess the wheel turns. (I knew this was going to happen to AMD when I saw them spending almost nothing on R&D and several of their architecture staff leaving; besides, much of their K7/K8 work came from IBM anyway...)
 

kelemvor4

Distinguished
Oct 3, 2006
[citation][nom]banthracis[/nom]Also, seriously, testing real world scenarios w/o using AA/AF? /fail[/citation]

+1 there, friend. I don't see why someone would pay for a high-end GPU and skimp on the monitor or CPU, yet you see people with a GTX 480 (or even a 5870) trying to game on Athlon II X3 CPUs to save a few bucks, or even gaming on smallish 22" monitors. It just doesn't make sense to me to blow that kind of cash on a card and leave yourself with a huge bottleneck that forces low settings (such as no FSAA).

Run max in-game settings, 16xQ or 32x FSAA, at 2560x1600 (or 1920 at a bare minimum) and see where you get with even a stock i7-920. CPU-bottleneck city is where you'll be, that's where.
 

kilthas_th

Distinguished
Mar 31, 2010
What was the point of using a 790X board in lieu of a 790FX one? I know you moved the second card in the CrossFire setups to an x8 slot on the X58 board to match, and I also know it probably doesn't make a huge difference, but it seems the article could have waited a day or two longer if you needed a board to arrive. Obviously, I could understand if you were comparing an i5-750/P55 setup to the 790X, both having limited lanes, but in this case it doesn't seem to make sense.
 

thejay_za

Distinguished
May 27, 2010
If money were not a problem I'd have a bunch of PCs, including a home PC and a gaming rig, but as it stands I can't afford that.

I have one PC that does everything, and everything at once, mind you: an i7-920 on an Intel X58 board with 3 GB of DDR3 and a GTX 295, running WinXP SP3 32-bit.

At the time, the Athlon II X3 wasn't available in South Africa; the X4s were, though, and I wanted to buy one. I'm a huge AMD fan and haven't dug Intel for a long time, but the Intel 920 CPU was the same price and the AMD motherboard was about $100 more expensive. So earlier this year I paid about $300 each for the motherboard and CPU, $100 for the RAM, and $520 for the GTX 295. Not much for my only PC. OK, I have two laptops as well, but they've got jobs to do: the 10" Acer One lives in my Canon EOS 7D bag as storage, and the LG S1 is only used as a mobile file-sharing archive at LANs.

So: a CPU at the same price, but much weaker, needing a more expensive board that couldn't run my (at that time) two Nvidia 8800 GTs. Why pay more for less, and for something that wouldn't support both my graphics cards?

Anyway, now I play anything at 1920x1080 with the graphics options wide open while PS3 Media Server live-transcodes and streams movies or TV series to the TV room (my mother is a sci-fi freak). Basically I can do anything I like in the background at any time - antivirus scans, defrags, formatting drives, burning discs, transcoding and muxing, PMS, downloads, whatever - and there's no effect on gaming performance :)

Incidentally, in the 9 or 10 months I've had this machine, not one single Windows crash, no bluescreens, nothing. And this is the same Windows install that was running on my AMD Athlon 64 3600: I plugged the boot drive into the new board, booted up, loaded the new motherboard sound and NIC drivers, and it's been running very fast ever since, with no reinstall :)

So yeah, I guess it boils down to what you want to do. As a pure gaming rig I'd go X3 as well, with a nice cheap Inno3D GTX 295, and not bother with overclocking (sorry, guys). But I need the background grunt, so the 920 was a clear winner.

It's like bikes: I ride on the road, deal with rush-hour traffic, and do track days, so I went with a 2009 Fireblade for the extra grunt ;) If I only did track days I'd have saved a fortune and bought a 675 Daytona and pimped it out.
 
Interesting article. I am using a tri-core myself, a cheap 8250e that I am overclocking. They hold up better than most sub-3 GHz dual cores, while AMD quads are also a good choice due to their low cost, even more so the sub-95 W samples. I avoid 140 W samples.
 

dgingeri

Distinguished
You guys didn't test World of Warcraft. It is currently extremely CPU intensive, and I don't see that changing. Right now you could run it on as little as a 9800 GT at maximum detail levels and maximum frame rates if you have a strong enough CPU. (This may change with the next expansion and the new water effects; I've heard it takes a good DX11 card to run those.)

I've seen the effects of powerful CPUs. Even a stock-clocked Core i7-920 won't give maximum frame rates in a raid; it must be overclocked to at least 4 GHz, and most dual-core machines won't even manage 10 FPS during a 25-man raid or in some areas of a high-population town. (The more player characters on screen, or the more spell effects going on, the more CPU horsepower is needed.) WoW also strongly favors Intel's architecture. So an Athlon II X3 wouldn't be able to keep up properly unless the player spends most of his time playing solo and staying out of major towns, which makes the game really boring.
 

B-Kills

Distinguished
May 14, 2010
I've got the AMD Phenom II X3 720 BE and get really good frame rates and fast processing with it paired with an XFX Radeon HD 4890 XT. I've tried FC2 on max settings at 1680x1050: better frames than the Athlon X3, though not by a big difference. The Athlon X3 is really good for the price and quality; if you've got money to throw around, go for the Intel or a higher-end AMD CPU.
 

cadder

Distinguished
Nov 17, 2008
1. I agree that the processors should have been overclocked in this comparison. I think most people who would build machines like these would overclock them as part of the build.

2. How close would the i5-750 be to the i7-920? There is still a fair difference in total cost, which would have made the single-CPU/single-GPU combination even cheaper.
 