Picking A Sub-$200 Gaming CPU: FX, An APU, Or A Pentium?

Page 8 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.
Playing Skyrim isn't "casual gaming" though. For Flash-based games, all the Popcap and BigFishGames type stuff, or replaying the titles you enjoyed eight or ten years ago, an APU is all you need. That's a huge market.
 
... how do you get such low scores with the FX? Were any optimizations done, some tweaking and tuning? OK, the results wouldn't be much better, but not this poor. I have an FX-8120 @ 4 GHz with an HD 5870 and 16 GB of DDR3 @ 1600 MHz 7-9-9-24. I don't even have an HD 7950, but my Skyrim results are not this low, so what am I doing right? Also, I noticed that Skyrim likes low latencies; there's a noticeable difference between CL9 and CL7. I tried 8x AA, but saw no noticeable difference from 4x AA.
 

It doesn't matter what 'optimizations' or 'hotfixes' you apply; FX Bulldozer is not a desktop chip, it's better suited to server-type environments.
For gaming, FX Bulldozer = FAIL.
I can do better with my 965BE @ 4.0 GHz.
 

In your Tom's Hardware review, which used a quad core, you shouldn't expect a latency advantage. That doesn't negate the point that latency affects dual cores more than quads. Now look at the added load you included as game load across all cores: even that small amount is a gain in favor of the Killer NIC. I wonder whether software throttling alongside the Killer NIC would have lowered FPS as well as latency in WoW.

In your test the MMO result does fall within the margin of error, but the TF2 test shows a gain. An MMO is less dependent on latency, since combat is select-an-enemy with randomized damage; there's no latency advantage, yet in the same test you get higher FPS. In TF2, though, latency is huge, since it's all about aim, and latency raises the chance that human error costs you a shot. Lastly, I pointed out software throttling using fps_max 100. You have to see that the Killer NIC does far better than the integrated solution alone.
[Chart: Killer Xeno Pro - load latency]

FPS is also somewhat higher in both games when using the Killer NIC.
[Chart: Killer Xeno Pro - load frame rates]

With a quad core running single-threaded WoW, you had underused cores doing the work whenever the Killer NIC wasn't used, i.e. you couldn't expect a huge latency advantage. Some would say the 10 ms in TF2 was huge, though. The Killer NIC can't beat the CPU unless the CPU is under heavy load in any case, much like running current games on a dual core.

Now, in the AnandTech review back in 2006, single-threaded games made single-core CPUs look more powerful than dual cores, i.e. the dual core at that time had an underused core.
[AnandTech 2006 benchmark charts]

With an underused CPU core there was no ping advantage, but look at the FPS above: given the FPS of the other three, it's somewhat outside the margin of error.
[AnandTech benchmark chart]

It did somewhat better than the others on average in the AnandTech review. In a single review you can chalk that up to margin of error, but not across all three reviews.

Every test was conducted on a CPU with underused cores. The Firefox plugin does the same thing as fps_max, i.e. limits FPS to free up CPU cycles; as I stated, freeing up CPU cycles in software works, but mostly at the cost of FPS. That said, all three reviews show an advantage, if a small one. The margin of error flies out the door when you have repeated results across three reviews. Back to the article: I wouldn't suggest this card, or software throttling, with a dual core over a good quad core. Would you?
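The fps_max-style throttling being discussed is easy to sketch. Below is a minimal, hypothetical Python illustration of the idea (the function names are mine, not anything from the Source engine): capping the frame rate means sleeping out the unused portion of each frame's time budget, and that sleep is exactly the CPU time freed up for networking or other threads.

```python
import time

def run_capped(render_frame, target_fps, n_frames):
    """Render n_frames, sleeping away the unused part of each frame's
    time budget. The sleep is what frees CPU cycles for other work,
    which is the idea behind fps_max-style software throttling."""
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(n_frames):
        t0 = time.perf_counter()
        render_frame()                       # the actual game work
        spare = budget - (time.perf_counter() - t0)
        if spare > 0:
            time.sleep(spare)                # yield the leftover budget
    return n_frames / (time.perf_counter() - start)  # achieved FPS

# A do-nothing "frame": with the cap at 100, the loop settles at or
# just under 100 FPS instead of spinning flat out.
achieved = run_capped(lambda: None, target_fps=100, n_frames=20)
```

The trade-off debated in this thread is visible in the loop: the heavier `render_frame` is, the less spare time there is to give back, so on a loaded dual core the cap costs you FPS in exchange for the freed cycles.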
 
[citation][nom]DjEaZy[/nom]... how do you get such low scores with the FX? Were any optimizations done, some tweaking and tuning? OK, the results wouldn't be much better, but not this poor. I have an FX-8120 @ 4 GHz with an HD 5870 and 16 GB of DDR3 @ 1600 MHz 7-9-9-24. I don't even have an HD 7950, but my Skyrim results are not this low, so what am I doing right?[/citation]

We actually used more optimizations for the FX than most folks, as we installed the Windows hotfixes.

The reason your frame rates are higher in Skyrim is that you're not using the same demanding benchmark area we are, plain and simple.
 


Games are becoming more multi-threaded. Last I checked, WoW could use a second thread for audio processing and some other minor things. Dual-core CPUs may be better off with a good network adapter in the near future, if not today. I still wouldn't suggest this over a quad core. The price difference between an Intel dual core and an Intel quad core is probably about the price of a good network adapter, so it's a moot point if you haven't bought the CPU yet, but it may help upgrade a computer that already has a dual-core CPU when you don't want to pay for a quad-core upgrade.

There are several things you can do to offload work from the CPU. Getting a sound card with its own processor is another way to help when the CPU doesn't have more threads available than the applications (such as games) running on it can use. Also, avoid USB drives and other devices that eat up a fair amount of bandwidth over the USB ports; USB also needs a little processing power, unlike FireWire.

If games begin to utilize four cores well, then there won't be spare cores for most people anymore. At that point more advanced NICs, sound cards, etc. might show a tangible performance benefit. Right now, I don't see a good reason to get something like this, especially considering how expensive they are. At least good sound cards have some reasoning behind them: they can improve audio quality for anyone who demands better than modern integrated audio.
 
[citation][nom]elbert[/nom]In your tomshardware review, using a quad core, you shouldn't expect a latency advantage.[/citation]

I don't buy that. Are you saying the Killer NIC is useless unless you have a dual core? Who runs a dual core in a serious gaming machine, and are you saying they'd be better off investing in the Killer NIC instead of a faster quad-core CPU?

Any 'advantage' the Killer NIC showed was so insignificant that a human being could never notice it, and in some of our tests it showed a disadvantage. You can pore over a million benchmarks showing a ±0.1% gain, but it's futile. At the end of the day it doesn't make a lick of difference unless you're crazy enough to download a ton of torrents while gaming online, and even then freeware will do the job.

If you want to convince yourself that margin-of-error results are more impressive than tangible gains from a better graphics card or CPU, be my guest.
 
I don't understand why the i5-2500K is present here. For one, it's above the stated price cap. And secondly, why is it the ONLY one that's allowed an overclock? Is this some sort of foul play... or is it that the i5-2500K doesn't perform adequately at stock?
 


It's stated at the start of the article why the i5-2500K is there: to provide a comparison between the cheap and the high-end CPUs. I think it should have been overclocked further (how many i5-2500K users stop at 4 GHz? $20 coolers can go higher than that with the 2500K) to provide a better comparison, and I don't get why it was only run at 4 GHz.

Besides that, the i5-2500K was NOT the only CPU that was overclocked. The second-to-last page has overclocking benchmarks for all of the overclockable processors; all of AMD's CPUs were overclocked there.

After ignoring those pages, you go as far as to imply foul play? You seem to be the one playing foul. FYI, the i5-2500K at stock will outperform even the i5-2400, and thus everything else here too. Fail troll/fanboi is fail.
 

It's apparent you either didn't read the article or (more than likely) didn't comprehend it.
It's OK; we understand.
 

Yes, the Killer NIC is nearly useless when you have underused cores, i.e. free CPU cycles to do the same latency work. All it does then is free up the busy cores for a few more FPS. Currently you get a more noticeable latency advantage with a dual core, because the card frees up CPU cycles for a game with only two threads. No, I'm not saying to invest in a Killer NIC over a quad, but to compare a dual core against a quad on latency, it's needed. Your quad core's latency is going to be lower in most games except single-threaded ones.

Freeware does the job, but then you sacrifice FPS when you should just buy a quad. That is the point: you either suffer software throttling, pay the cost of the Killer NIC, or just buy a quad. This was the point I was making before you replied; look back at my reply to dj christian.

I am saying you need a Killer NIC with a dual- or tri-core CPU to compare it to a quad; otherwise you suffer up to 10-15 ms higher latency and will need to throttle. What I am saying is that you're better off investing in a quad, as quads generally keep your games at lower latency.

Most humans never notice a 10-15 ms change, but it does make a major difference in first-person shooters.

Now, with three reviews, margin of error is out, as the same small but noticeable increase keeps repeating. The scientific approach to ruling out margin of error is separate tests reproducing the same results; in this case, a small improvement. I'm not saying it's any more than small, but that is also the latency advantage a quad has over a dual core: small, but huge in the world of first-person shooters.
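The reproducibility argument can be put on a rough statistical footing (a back-of-envelope sketch, not a claim about these specific reviews; the function name is mine). Treat each independent review as a coin flip under the null hypothesis that the NIC makes no difference; a one-sided sign test then gives the odds of every review landing on the same side purely by chance.

```python
from math import comb

def sign_test_p(wins, trials):
    """One-sided sign test: probability of at least `wins` favorable
    outcomes in `trials` independent reviews if the true difference
    were zero (each review a fair coin flip under the null)."""
    return sum(comb(trials, k) for k in range(wins, trials + 1)) / 2 ** trials

# Three out of three reviews favoring the same card:
p_three = sign_test_p(3, 3)   # 0.125, i.e. one chance in eight
# Five out of five would be far more convincing:
p_five = sign_test_p(5, 5)    # 0.03125
```

Note that three-for-three still happens by coincidence one time in eight, so repetition narrows the margin-of-error explanation without eliminating it; more independent tests on the same side would be needed to call the result statistically significant.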
 
[citation][nom]DjEaZy[/nom]... how do you get such low scores with the FX? Were any optimizations done, some tweaking and tuning? OK, the results wouldn't be much better, but not this poor. I have an FX-8120 @ 4 GHz with an HD 5870 and 16 GB of DDR3 @ 1600 MHz 7-9-9-24. I don't even have an HD 7950, but my Skyrim results are not this low, so what am I doing right? Also, I noticed that Skyrim likes low latencies; there's a noticeable difference between CL9 and CL7. I tried 8x AA, but saw no noticeable difference from 4x AA.[/citation]
For the most part, that's probably 4 GB vs. 16 GB. I haven't played Skyrim, but some games blow chunks if they have to page out to the HDD's virtual memory.[citation][nom]malmental[/nom]It's apparent you either didn't read the article or (more than likely) didn't comprehend it. It's OK; we understand.[/citation]

Just a last thought on the 2500K being used: it would have been more entertaining to show the SB-E 3960X instead and see how a $200 CPU stands up to $1000+.
 
[citation][nom]noob2222[/nom]For the most part, that's probably 4 GB vs. 16 GB. I haven't played Skyrim, but some games blow chunks if they have to page out to the HDD's virtual memory.[/citation]

I doubt you'd see much difference. 4 GB is plenty, and in the odd game that needs more it's rarely more than a few FPS difference.

[citation][nom]noob2222[/nom]Just a last thought on the 2500K being used: it would have been more entertaining to show the SB-E 3960X instead and see how a $200 CPU stands up to $1000+.[/citation]

I'd be really surprised if a stock SB-E 3960X performed faster than a 2500K @ 4 GHz when it comes to games.
 


And I'm saying it doesn't, and the majority of evidence doesn't support that.

Even if it did, that money would be MUCH better spent on a CPU upgrade.

We'll have to agree to disagree on this one, Elbert.
 

Sorry, but you missed my point: it's much better to spend on a CPU upgrade to a quad than to suffer a dual or tri core's extra bit of latency. I think you replied to my post without fully understanding the discussion we were having.

I have pointed out the majority of the evidence. The margin of error in each review was on the side of the Killer network cards. The Tom's guide was mostly on the side of the Killer network card.
 


The majority of reviews don't support your theory. It's a useless placebo.

If you played a game on a machine with a Killer NIC and then on one without a Killer NIC there's no way you'd be able to notice a difference, regardless of CPU.

As I said, we'll have to agree to disagree. :)
 

I'm not saying you'd notice the difference, but with lower latency in first-person shooters you get a higher score: your aim will be slightly more dead-on. Latency in first-person shooters is worth far more than every FPS over 60.

Here is another benchmark showing the Killer NIC with a small advantage in FPS and ping. Not outside the margin of error by itself, but you must see that even that small amount of latency is important to gamers.
http://www.overclockersclub.com/reviews/bigfoot_killer_2100/7.htm
 


Noticing kills would qualify as a difference. I'm saying there isn't one.
 

I'm sorry, but are you saying that with slightly lower latency you wouldn't get more kills in a first-person shooter? I don't think you mean that, because it's just absurd. I'm guessing you mean there's no difference the Killer NIC can produce.

A ping lower by 1 over a 5-minute average, with an average of 1 FPS higher.
http://www.bit-tech.net/hardware/2010/09/02/killer-2100-gaming-network-card/4
 


Bingo! Given that some reviews show no advantage and others show a slight one, I'm of the opinion it's wishful thinking combined with the margin of error.

Unless you want to sample a statistically significant number of gamers using, and then not using, the Killer NIC, and compare the number of kills between both, your evidence is anecdotal and wishful at best.

As I said, we'll have to agree to disagree. :)
 

[Image: DeadHorse.jpg]
 

You don't have to sample gamers, because lower latency is king. Gamers in first-person shooters are so worried about latency that there are thousands of guides on lowering it.
http://cod4guides.blogspot.com/2009/06/latencyping-guide.html

All you have to sample are the reviews showing lower latency, and if they all show a slight win that's within the margin of error, the scientific method next points us to reproducible outcomes. It's no longer margin of error or wishful thinking if it's reproducible. Guess we disagree. :)
 


We can agree on that! I've been saying it for a few posts now, actually. :)

But if you ever manage to prove that the Killer NIC gets gamers more kills in an actual test, send that info my way. 😀
 

It's pretty much a known fact that lower latency gets gamers more kills. Higher latency reduces reaction time, and what you're aiming at may no longer be where you see the object or player.

You really don't need to prove the Killer NIC gets gamers more kills; you only need to prove that lower latency gets gamers more kills.
http://cod4guides.blogspot.com/2009/06/latencyping-guide.html
Below 50 > great hit ratio
Below 100 > still a great hit ratio
120-140 > you'll sometimes miss a running opponent
170-240+ > horrible lag, stutter, and teleporting; just disconnect and look for another server
Then prove the Killer NIC reduces latency to any degree. Every review I have seen shows the gain as small but within the margin of error. Again, though, every review shows it is repeatable, which under the scientific approach means it's no longer margin of error.
 


Cool that you think so! I disagree.

Say, the next time you want to debate this with me, just look back at this reply and reread it. It pretty much covers it, I think. :)
 