Picking A Sub-$200 Gaming CPU: FX, An APU, Or A Pentium?

Yeah, stock. I'll run the full five-minute bench and send the Fraps file. The log file output from Civ V makes no sense; across three tests the numbers in that file changed by maybe 10 points, even with Fraps showing the difference.
 
Now, about the AMD FX-8120: you have to remember that it hasn't reached its full potential, because Windows 7 doesn't schedule around the dual-core modules in the processor; it only focuses on per-core performance, just like it does for Intel. In Windows 8, however, this should be fixed, and then we'll see who the winner is.
 
I'm kind of over the whole overclocking thing. I have seen a couple of people totally kill perfectly good chips by trying to overclock. For one thing, I think both Intel and AMD are stretching their silicon quality assessments and maybe tagging chips with higher speeds than they should. Or maybe it's just that a lot of people don't properly verify that their overclock settings are truly stable? For the cost these days, I've just been paying for most of the speed and overclocking less. AMD is definitely not doing well with their APUs. To me, the APU is just another combo chip with too many limitations.
 
Excellent article, much appreciated.
I personally own 'almost' three of the CPUs listed:
an i5-2500K (Z68) for gaming, an i3-2100 (H61) as an HTPC, and a 965 BE (AM3+ 990XA) for everyday use and some gaming as well.
None of my units disappoint.

As for the FX Bulldozer: meh at best, even with the 'hotfixes'.
I hope the stepping revision and the FX-4170 replacing the FX-4100 will help, but it won't be by much.
What about Piledriver... will that be the 'sidegrade' for me to move to from my 965 BE?
Or will it be strictly Intel from then on?

I do, however, root for AMD to be successful.
 
Talk about price/performance; it looks like AMD isn't the value choice anymore, and even the new Radeons aren't cheap.

Why can't they just sell Bulldozer much cheaper? At least then they'd still be able to cut into Sandy Bridge sales. If they sold their flagship Bulldozer at a price near the i3-2100, nobody would complain.
 
I do appreciate the look at overclocked results later in the testing, but comparing the 2500K overclocked to 4.0 GHz in the first few tests is a bit disingenuous. At stock speeds it would test a bit lower, though it would likely still remain at the top of the charts. But having it be the *only* overclocked processor in the first tests goes against the rest of the testing procedures, and doesn't show the actual (slightly lower) performance it would deliver at stock multipliers. It would probably be even closer to the 2400, given that the only difference at stock settings is the multiplier.
 


Not really, if you read the accompanying text and put things into context. It provides a valuable, constant reference point to compare: is it worth it to pay more and overclock, or will a stock sub-$200 CPU do the job? In the case of the i5-2400, the 2500K at 4 GHz is barely better.

That's useful info, and there's no nobler cause than adding more points of reference for an informed comparison.
 
I'm still trying to get the "T" chips added to my points of reference, but Google isn't being very friendly; about all that's out there (accessible from work anyway) are some Passmark scores, which I consider near useless. In an upcoming build with a pretty inflexible 300W power ceiling, that 30W difference can be another tier on the GPU charts.
 


What this network card does is offload latency processing. If you don't have an underused core to handle latency, then with a dual-core you will see, on average, about 10~15 ms of added latency: extra time your OS has to wait while the 2~3 cores pushing FPS get around to latency handling. Many gamers will limit their FPS to decrease their latency. In other words, you have three ways to get your best latency: an underused core, a Killer network card, or limiting FPS in software. In Counter-Strike: Source this is achieved with fps_max 100, which frees up the CPU somewhat for latency handling. These two network cards have a dedicated processor for latency handling:
http://www.newegg.com/Product/Product.aspx?Item=N82E16833189003&Tpk=killer
http://www.newegg.com/Product/Product.aspx?Item=N82E16833342001
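
If you want to try the FPS-cap route, here is a minimal sketch of a Counter-Strike: Source autoexec.cfg. Only the fps_max 100 line comes from the advice above; the file location and the other console variables and their values are common examples assumed for illustration, not settings from the article.

// autoexec.cfg sketch -- typically placed in the cstrike/cfg folder of a CS:S install (assumed location)
fps_max 100        // cap the frame rate to leave CPU time for latency handling (the tip above)
cl_cmdrate 66      // assumed example: command packets sent to the server per second
cl_updaterate 66   // assumed example: updates requested from the server per second
rate 30000         // assumed example: client bandwidth cap in bytes per second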
 


I agree wholeheartedly with throwing in the slightly more expensive CPU to show the comparison, I just don't agree with showing it overclocked vs. all of the other stock parts. The more direct comparison would be to either a stock 2500K or the 2500, as that is ~$210 in most online stores. I'd definitely agree that those building a computer for gaming would likely choose the 2500k over the 2500 and then plan on overclocking, but showing the 2500k right off the bat overclocked doesn't fall in line with the rest of the testing procedures and their respective graphs/charts. Showing the 4GHz 2500k on the overclocked charts for that comparison would be more than fair, given that due diligence was taken with it being mentioned on the first page that the 2500k does exceed the $200 price point.

And I do realize that even with all this said, none of the results change in the slightest. The methodology is sound, and that's why I've always enjoyed the articles found here.
 
I was surprised to see that my i5 was listed just one step below an i7 in the laptop performance rankings, something like #12 and #13. It saved me $150: a 17-inch Dell Inspiron with a mid-level Nvidia discrete GPU for $749 instead of $899. It plays COD MW3 at full settings without a problem, as far as I can tell. I could have saved another $150 by going with a similar AMD laptop, but I was afraid it would be underpowered in the long run.
 
[citation][nom]pfkninenines[/nom] showing the 2500k right off the bat overclocked doesn't fall in line with the rest of the testing procedures and their respective graphs/charts. [/citation]

I don't agree; I think it's great for perspective, and it's explained as such in the article.

I guess we'll have to agree to disagree on that.
 
This is the most ridiculous review of $100-200 gaming CPUs ever: pairing such a CPU with a $550 GPU. There are only two circumstances in which someone interested in gaming would consider a $100-200 CPU:
--either a sub-$700 PC with a $100-200 GPU, or
--a sub-$700 laptop with no additional GPU.

In the laptop case, the AMD Llano A8 shines. I just bought a $499 A8-3850 Gateway laptop and it plays Skyrim with no problems. I would challenge any Intel laptop at that price (i3, or even i5) to do that. (BTW, I was told the A8-3850 also plays Crysis well, at least at medium resolution/effects.)
 
[citation][nom]elbert[/nom]What this network card does is off loads latency processing. If you don't have an under used core to process the latency with the dual cores you will see on average of about 10~15 ms increased latency. Extra time your OS has to wait for the 2~3 cores pushing fps to get to latency handling. Many gamers will limit their FPS to decrease their latency. In other words you have 3 ways to get your best latency which are under used core, killer network card, or software limiting FPS. In counter strike source this is achieved by fps_max 100 which frees up the CPU some what for latency handling. These two network cards have a processor for the latency handling.http://www.newegg.com/Product/Prod [...] Tpk=killerhttp://www.newegg.com/Product/Prod [...] 6833342001[/citation]

We tested the Killer NIC and it did absolutely nothing for FPS or latency in online games.

At best, if you're playing an online game while downloading torrents (which is dumb in the first place) it can prioritize gaming traffic. But in our tests it was no more effective than freeware network prioritizing software.
 
[citation][nom]JohnK11[/nom]This is the most ridiculous review of a $100-200 cpu gaming review ever--pairing such a cpu with a $550 GPU. [/citation]

What's ridiculous is that you didn't read the review, nor do you grasp that the entire purpose of a CPU test is to isolate the CPU as a variable, not introduce other bottlenecks.
 
Honestly, this review was a giant eye-opener. I had read reviews comparing FX processors with the 2500K, etc., but this is something else. It's great to see solid testing done on the 6- and 4-core parts, though one was simulated.
There really isn't a point to an AMD build nowadays. I've built exclusively AMD computers for more than 12 years. What a sad day. For anything less than an i5, you may as well get a G630 on a Biostar Z68 and call it a day: $190 USD on Newegg for CPU and motherboard. Even the FX-4100 with a good AM3+ motherboard is about $205.

/resignation

Since I refuse to support Intel, I guess this means I'm waiting for Piledriver, whenever that may be. (Not that it matters, as my current CPU does as well as the FX does.)
Hopefully Nvidia or ICube or someone will rise to challenge Intel in the future.
 

keep hope alive.. :sarcastic:
 
I was hoping this article was about a sub-$200 gaming computer... now that would be interesting.
 
Interesting numbers, but useless. It's pretty obvious by now that if you're running a GPU over $400, you should be pairing it with at least an i5-2400. The question I want answered is: if you're running a 6870, a 6950, or 6770s in CrossFire at medium quality settings, do the AMD chips provide enough grunt for the GPU? That scenario would be more valuable to people like me (X4 955 + 5770), whom this article is presumably aimed at.
 

FPS test from the Tom's Hardware article:
Game                 Min   Avg   Max
Crysis (Realtek)     25    31.6  37
Crysis (Killer NIC)  28    34.6  38
2L (Realtek)         66    67    68
2L (Killer NIC)      70    71    72

Ping times:
Game     Realtek     Killer NIC
Crysis   35/71/107   27/53/79
2L       43/87/131   38/67/96
Was there a different Tom's Hardware review of the Killer NIC? That was over four years ago, and I had to dig it up to remember the details; I wouldn't blame you for a misquote, as I couldn't remember much about its conclusion either.

I would guess you saw no latency benefit because the test used an Intel Core 2 Extreme QX9650. With that setup there were both underused cores and the card. At that time you would have needed a single- or dual-core CPU to see any latency benefit; in 2008, a dual-core in many games would still have had an underused core. The end result of using a Killer NIC with CPUs that have underused cores is only "slight but measurable differences in CPU utilization" compared to no Killer NIC.
http://www.tomsguide.com/us/killer-m1-nic,review-1083-3.html

Even if the Killer NIC is trash, a quad-core running Counter-Strike: Source still sees no latency benefit from using fps_max; i.e., it's already getting the best latency it can. A single- or dual-core, though, will see 10~15 ms lower latency when using fps_max to limit CPU usage. It's a very well-known setting for freeing up CPU cycles for latency handling.
http://www.chacha.com/question/how-can-i-reduce-lag-in-counter-strike-source
http://easterncanada.20.forumer.com/a/posts.php?topic=206&start=
 


That doesn't make sense. If that's obvious it's because this review shows it to be so, as I'm not aware of any other CPU comparo that does what we've done here.

What you're saying is that you *assumed* a more expensive CPU might be necessary, but before this review you probably *assumed* that a GPU over $400 should be paired with a 2500K or better...

You probably also *assumed* that the i3-2100 wouldn't game faster than the Phenom II X4 955 or FX-8120.

Useless? Nope. Quite illuminating, actually. :)
 
[citation][nom]jesh4622[/nom]What a sad day. For anything less than an i5, may as well get a G630 on a biostar z68 and call it a day. $190 usd on newegg for cpu and motherboard. Even the fx 4100 with a good am3+ motherboard is about 205./resignationAs I refuse to support intel, I guess this means I'm waiting for pile driver, whenever that may be. (Not that it matters, as my current CPU does as well as FX)Hopefully Nvidia or ICube or someone will rise to challenge intel later on in the future.[/citation]

The G630 is really only competitive with AMD in lightly threaded apps, though, which most current games are. The lack of Hyper-Threading means it falls behind in multithreaded workloads by a decent margin.

http://www.anandtech.com/bench/Product/406?vs=106

So it really comes down to your needs. I don't think an HT-less dual-core is a future-friendly option unless you plan on dropping a better CPU in there sometime in the near future, or your ONLY concern for that machine is gaming.

That's really what it comes down to, though. AMD is only losing in the dedicated CPU + GPU market. If you're gaming, they really don't have a good product for you outside of a certain budget. How much of the consumer market uses a good discrete GPU, though? Their APUs are a solid offering for a lot of average users who need enough GPU to stream 1080p video and enough CPU to transcode it, and that's about it. The issue is that they're offering a good product for average consumers who are more familiar with the Intel name, and not offering a product for the enthusiast market that is knowledgeable enough to know who they are and what they offer.
 
I don't understand the APUs' on-die GPU; it's juuuust not enough for low-end gaming, but overkill for everything else (Blu-ray, whatever). The only thing it might help with is stuff like video and photo editing, but still...

And for the record, I know this because I own an A8-3850. The GPU on it is crap. Skyrim at ALL lowest settings, 1280x720, gets you around 30 fps AVERAGE. My girlfriend thinks it's OK (I think she just doesn't want to say anything bad about the computer I just built her), but for anyone who has EVER gamed before, 30 fps is unacceptable.

$0.02
 