Gamers: Do You Need More Than An Athlon II X3?

Page 4
Status
Not open for further replies.

spirit123

Distinguished
Jul 13, 2009
10
0
18,510
0
This setup is wrong. You should take all the extra money spent on the Intel CPU system, use it to get a better GPU for the AMD system, and then run all these tests. That way you'd be comparing systems at the same total price, for gaming only. And no CrossFire or SLI.
 
G

Guest

Guest
Know what they should do to help gamers? Make a review showing the difference in loading times between an SSD and a hard drive. Every time they do an SSD review they fill it with synthetic benchmarks and MB/s numbers that don't tell us much, and then focus on Windows loading time, which only happens a couple of times a day on a gamer's PC. What's the real-world difference between a throughput of 100 MB/s and 220 MB/s? I don't know. What I want to know is whether my SC2 load times will be cut in half by using one of those.
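A rough way to answer that question yourself, as a minimal sketch: time the same large read on each drive and compare. Everything below, including the file path you'd pass in, is hypothetical; it stands in for the I/O portion of a game level load, not the whole load.

```python
import time

def time_read(path, chunk_size=1 << 20):
    """Read a file end-to-end in 1 MB chunks and return the
    elapsed seconds. A crude stand-in for the disk I/O portion
    of a game's level load."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk_size):
            pass
    return time.perf_counter() - start
```

Run it on the same asset file placed on the SSD and on the HDD (after a reboot, so the OS page cache doesn't hide the difference) and compare the two numbers. Keep in mind that real load times also include CPU-side decompression and setup, which is one reason an SSD rarely halves them outright.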
 

lauxenburg

Distinguished
Feb 9, 2009
540
0
19,010
5
If you "need" more than 60FPS in any game, you need to get your priorities/life straightened out. Or someone has to hide your wallet/bankbook somewhere.
 
G

Guest

Guest
"There are some people who might get the impression that we're being unfairly hard on the Athlon II X3 440 by pitting it against the Core i7-920. In fact, the opposite is true."

Then you short the i7 a gig of RAM and some PCIe lanes, and compare the platforms, seemingly just to help prove your point.

Yes, the X3 has respectable gaming performance. But portraying the two as nearly equal is misleading when you're crippling one of them.

With the power of today's processors, like the X3 or the i5, there is little reason to jump to the i7 unless you are going to run SLI/CrossFire and can make use of the extra PCIe lanes. The only thing the article proves is that the X3 does well in the market it was designed for. Yet you don't do the same for the i7, but you still compare the two.

I'm not an i7 fanboy or anything; I just hate to see the objectivity leaving Tom's. The articles lately seem like you pick a title, then build a test to prove it's true.
 

redgarl

Distinguished
Something I've been saying since the launch of the Core i7: sincerely, there is no need to spend the extra money for so little gain. If you're into gaming, you should always invest in your graphics solution instead.
 

Onus

Titan
Moderator
This article did not ask the question "is the Athlon II X3 as good as the i7-920 for games?" Of course we all know it isn't; an article wasn't needed to prove that point. The question it asked was "is the Athlon II X3 good enough for games?" The answer to that question is, "Yes." So, nicely done.
 

cleeve

Illustrious
Moderator
[citation][nom]Smurfzilla[/nom]"There are some people who might get the impression that we're being unfairly hard on the Athlon II X3 440 by pitting it against the Core i7-920. In fact, the opposite is true."Then you short the i7 a gig of ram and some PCI lanes, and compare the platforms. Seemingly just to help prove your point...

...I'm not an i7 fanboy or anything, just hate to see all the objectivity leaving toms. The articles lately seem like you pick a title, then go make a test to prove it's true.[/citation]

Well, the difference between 3 GB and 4 GB is meaningless when it comes to game performance, yet using triple-channel memory helps the i7's bandwidth, something you forgot to mention. Or are you suggesting it would have been more fair to give the i7 6 GB of RAM?

As for running the second slot at x8 PCIe instead of x16, that's also next to insignificant. We have looked into both of these factors in depth in previous reviews, and neither makes any performance difference to speak of, certainly nothing that would affect the outcome of our tests.

You need to look into what you're suggesting is a problem before you throw an accusation around. As it stands, your accusation doesn't make any sense.

Furthermore, you say we're trying to prove a point, and that indicates bias in your own thinking. We didn't start out with any preconceived notion of what would happen; we have no point to prove other than reporting our findings. We simply asked a question and then saw where the tests led us.

Assuming we have a point to prove suggests you've taken offense at our data. And if you're taking offense at test result data because it doesn't fall where you'd like it to, you might want to look in the mirror when pointing fingers about objectivity. ;)


 

kitekrazy1963

Distinguished
Feb 1, 2010
89
0
18,630
0
One other thing I got out of those benchmarks: they confirm that I still think World in Conflict sucks.

Replace it with GTA4:EFLC. I want to see those benchmarks.
 

Fokissed

Distinguished
Feb 10, 2010
392
0
18,810
13
[citation][nom]kelemvor4[/nom]+1 there, friend. I don't see why someone would pay for a high end GPU and skimp on the monitor or CPU; yet you see people with gtx 480 (or even 5870) trying to game athlon2 x3 cpu's to save a few bucks; or even trying to game on smallish 22" monitors. It just doesn't make sense to me to blow that kind of cash on a card and leave yourself with a huge bottleneck requiring low settings (such as no fsaa).Max settings in the game, 16xQ or 32x fsaa etc @2560x1600 (or 1920 at a bare minimum) and see where you get with even the i7 920 at stock. CPU bottleneck city is where you'll be, that's where.[/citation]

32xFSAA will bottleneck the CPU at 2560x1600???
 

skora

Distinguished
Nov 2, 2008
1,498
0
19,460
56
I'd like to see how much of a performance increase you'd get going from a three- to a four-core setup on a dirty/unoptimized OS. With so much focus on real-world performance, benching the raw power of a chip on a clean install only reflects reality if home users actually run a clean OS. It might be something to test; if there is a notable difference, add a dirty-OS run of one of the clean-install games to the benchmark suite for comparison. The extra performance a quad can offer might be worth a few bucks to someone who isn't obsessive about keeping their OS lean.
 

Proximon

Illustrious
Moderator
Very good article. I think you should have added some comment about longevity however. Lately around here there has been too much focus on building for today, ignoring tomorrow. Anyone who has been building rigs for a few years will know that anything built today will eventually become obsolete, and it is only a question of when.
 
Any decent person who calls himself a gamer won't settle for less than a 75 Hz-capable monitor... I myself play on an 85 Hz CRT monitor (yes, it weighs a ton, but it looks like heaven!).

Cheers!
 

pinkfloydminnesota

Distinguished
Mar 4, 2010
181
0
18,680
0
Funny that you'd stress the importance of minimum frame rates, seeing as you usually ignore this data in most video card reviews, focusing on average framerates to the exclusion of all else.
 

axe1592

Distinguished
Mar 29, 2010
232
0
18,710
23
This is a great article and very educational. I'm a budget gamer, so I'm always interested in "bang for the buck" info.

I'm running an Athlon II X3 435 overclocked to 3.4 GHz, paired with an overclocked 4850, and I'm getting mid-40s framerates in Crysis' benchmark with medium/high settings at 1680x1050, so I can attest to its potency. Especially considering the CPU and GPU together cost me $160!

Bottom line: you don't have to spend $3,000 to have a capable gaming system. But if you have the cash, there are definite benefits to spending $3,000.
 

skreenname

Distinguished
Mar 20, 2009
55
0
18,630
0
The part where you mention how the type of game determines the acceptable FPS is totally true.

When I only had my laptop, I could only get 10 FPS in World of Warcraft with the settings on low (except in caves, where I got 30 FPS).
And that was pretty OK.
Not the best-case scenario, but definitely playable since it was WoW.

But if I tried to play much else, I would get the same or worse framerate, and it wasn't playable.
 

blll_gates

Distinguished
Apr 22, 2010
15
0
18,510
0
It wasn't mentioned that the $200 midrange CPUs (i5-750, II X4 955) perform better in most games than the i7-920 and i7-950. That's really the compelling part. The new 890FX chipset has a USB 3.0 controller and SATA 6 Gb/s built in, plus multiple PCIe x16 lanes, and is more future-proof. I don't take sides in Intel vs. AMD, but AMD's offering is the best value. I would have built an i5-750 gaming rig if Intel hadn't gotten greedy and cut bandwidth on the PCIe x16 slots of their P55 motherboards so that people had to buy an i7 on the X58 chipset.
 

mattmock

Distinguished
Sep 28, 2009
59
0
18,630
0
[citation][nom]lauxenburg[/nom]If you "need" more than 60FPS in any game, you need to get your priorities/life straightened out. Or someone has to hide your wallet/bankbook somewhere.[/citation]
I need 0 FPS in my games; they are a luxury. Having more than 60 FPS just makes the playing experience cleaner and more enjoyable, in the sense that at 120 FPS you never notice your framerate at all.
Criticizing people for wanting high FPS is like criticizing people for buying music you don't like: the value is clearly subjective and varies from person to person. Likewise, people don't all perceive FPS the same way.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
32
[citation][nom]Smurfzilla[/nom]"There are some people who might get the impression that we're being unfairly hard on the Athlon II X3 440 by pitting it against the Core i7-920. In fact, the opposite is true."Then you short the i7 a gig of ram and some PCI lanes, and compare the platforms. Seemingly just to help prove your point.Yes, the x3 has respectable gaming performance. However, portraying them as having little difference between the two is wrong when you're crippling one of them.With the power of today's processors, like the x3 or the i5, there is little reason to jump to the i7. Unless you are going to sli/xfire and can make use of the extra PCI lanes. The only thing the article proves is the x3 does well in the market is was designed for. Yet you don't do the same for the i7, but you still compare the two.I'm not an i7 fanboy or anything, just hate to see all the objectivity leaving toms. The articles lately seem like you pick a title, then go make a test to prove it's true.[/citation]

But the i7 has a clear lead despite all the so-called "crippling."
 
G

Guest

Guest
Nice shot! But we know that ATI's HD 5000 series somehow has the same shortcoming as AMD's CPUs: a disadvantage in multi-target rendering. How about another run using NVIDIA's GTX 480/470 cards?
 

Orumus

Distinguished
Mar 23, 2010
10
1
18,515
0
[citation][nom]Netherwind[/nom]by that logic they should have overclocked the 920 because that's what most will be doing when they use that processor for gaming..[/citation]

I agree with you. When doing a gaming article, I think an "overclocked" section should be mandatory: this is how they compare stock, and this is how they compare with everything tweaked to the max. That is a very helpful metric in an article comparing CPUs/GPUs. Also, although I know the point of the article was to see if an X3 bottlenecks a GPU in modern games, I would have liked to see these CPUs matched with "budget" GPUs such as a 5770.
This article hit me square in the chest, seeing as I just built a new budget gaming rig based on the X3 440 and HD 5770, and for the money it's a great system. Although I wasn't lucky enough to get a CPU with a functioning fourth core, I do have it running at 4.13 GHz on air (29°C under load), with the GPU at 950 MHz and memory at 1420 MHz, also on air. These parts have proven to be fantastic for gaming: I run at 1280x1024 with everything maxed and average 60+ FPS in games such as Bad Company 2.
 

njalterio

Distinguished
Jan 14, 2008
780
0
18,990
1
[citation][nom]Don Woligroski[/nom]If you've ever had the chance to see a demonstration of movie playback at your local home theater electronics outlet, you might have noticed that movies seem a lot smoother than they do in theaters on some of the displays. This is because many modern televisions can modify the video, smoothing it out with anti-judder technology, and play it back at 120 Hz (or 120 FPS). Most folks easily notice the visual difference when movies are played back at 120 FPS with anti-judder enabled, which goes to show the human eye can perceive a lot more than 24 FPS. In fact, research suggests that human beings can perceive more than 200 FPS.[/citation]

Don, while you usually are spot on in your graphics card analysis I have to call B.S. on this section of your write-up.

The anti-judder isn't resolving issues with the frame rate; it's making up for the LCD's motion handling via sample-and-hold. LCDs use transistors while CRT and plasma displays use phosphors. Relative to phosphors, transistors respond much less quickly, causing LCDs to have motion issues that CRTs and plasmas do not.

So you might want to reconsider your statement "...the human eye can perceive a lot more than 24 fps," as 120 Hz displays are not 120 Hz for the sake of our ability to see 120 Hz, but so that LCDs can have better response times that lead to less judder. A 60 Hz plasma or CRT will look just as smooth as a 120 Hz LCD (disclaimer: sometimes the plasma will actually look smoother, because the Auto Motion Plus feature that comes with 120 Hz LCDs causes artifacts of its own). Take a look at the new LCD displays; notice how the response times of all the 120 Hz models are lower than those of the 60 Hz models? Look at plasma display specifications: notice how they don't list a response time?

So what does this mean for PC gaming? Get a good LCD monitor that has a response time of 2ms to 5ms and your gaming will be smooth so long as you are rendering around 60 fps (this is the flicker-fusion point), and enable vsync if you are rendering above that. I personally play newer games closer to 30 fps due to my graphics hardware/settings. At 30 fps I can notice some stuttering, but it is not enough to make the hassle of installing a new graphics card worth my while. I have no issues whatsoever with input lag.
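The cadence half of the judder debate can be made concrete. As a sketch (not from the article): 60/24 isn't an integer, so on a 60 Hz display each 24 fps film frame must be held for an uneven number of refreshes (3:2 pulldown), while 120/24 = 5 lets every frame be held equally long.

```python
def pulldown_pattern(film_fps, refresh_hz):
    """Return how many display refreshes each film frame is held
    for over one second. An uneven mix of hold counts is the
    cadence judder that 120 Hz playback avoids."""
    holds = []
    shown = 0
    for frame in range(film_fps):
        # each film frame owns refreshes up to its share of the second
        target = round((frame + 1) * refresh_hz / film_fps)
        holds.append(target - shown)
        shown = target
    return holds

# 24 fps on 60 Hz: a mix of 2- and 3-refresh holds (judder).
# 24 fps on 120 Hz: every frame held exactly 5 refreshes (smooth).
```

Note this only models the cadence; the sample-and-hold motion blur njalterio describes is a separate effect.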
 