Picking A Sub-$200 Gaming CPU: FX, An APU, Or A Pentium?

Status
Not open for further replies.
The PC gaming ecosystem isn't very friendly to AMD CPUs: we have console ports that demand excessive per-core performance, and many devs still rely on old code and engines. It's also hard to recommend an APU for a cheap gaming system. A cheap Pentium with a cheap discrete graphics card is better most of the time, unless the software landscape changes:

1. Games need to utilize more than one CPU core for rendering. To achieve good scaling, PC games have to use more than one rendering thread, so the burden can shift to the GPU.
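The scaling idea above can be sketched with a toy example: splitting per-frame CPU work (e.g. preparing draw commands) across several worker threads instead of one. This is only an illustrative sketch, not how any real engine does it; `prepare_draw_calls` and the "command" strings are hypothetical stand-ins for real per-object CPU work:

```python
from concurrent.futures import ThreadPoolExecutor

def prepare_draw_calls(objects):
    # Hypothetical per-object CPU work: build one GPU command per scene object.
    return [f"draw({obj})" for obj in objects]

def build_frame(objects, threads=4):
    # Split the scene into chunks, prepare commands on several render threads,
    # then merge the per-thread command lists before submitting to the GPU.
    chunks = [objects[i::threads] for i in range(threads)]
    with ThreadPoolExecutor(max_workers=threads) as pool:
        command_lists = list(pool.map(prepare_draw_calls, chunks))
    return [cmd for command_list in command_lists for cmd in command_list]

commands = build_frame(["tree", "rock", "npc", "sky"], threads=2)
```

With all render threads contributing commands, one slow core stops being the ceiling, which is exactly the kind of scaling a quad-core-recommended game should show.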

This brings up Skyrim: tests suggest the game only uses two cores, yet its recommended spec calls for a quad core. Doesn't it seem like they're trying to hide their engine's age and deficiencies by deceiving people into thinking it's cutting-edge?

2. With an on-die GPU, games can start to utilize heterogeneous computing, taking advantage of the low-latency, high-bandwidth communication that comes from having the CPU and GPU next to each other. The CPU and GPU can then assist each other with general processing in games, in addition to a discrete GPU handling rendering. That way the superior GPU in AMD's APUs can shine, provided AMD's GPUs still have an edge over Intel's.
 
[citation][nom]Cleeve[/nom]What use is data from resolutions nobody uses?[/citation]

I'm pretty sure people use other resolutions, not that I disagree with testing at 1080p.

[citation][nom]Clonazepam[/nom]Now on to llano... they named a chip the "A8-3870K". It looks strikingly similar to "i7-2600K", and at first glance to the uninitiated, it would appear to be more powerful given that it has higher numbers all around. If you ask me, AMD begged to be thrown to the wolves.[/citation]

This is kind of silly to me, the whole "AMD stole the K to fool people" argument. If people aren't smart enough to know the difference between the two, they deserve what they get.

[citation][nom]Clonazepam[/nom]Now as far as comparing llano + discreet card in hybrid xfire, vs. an i3 with a single gpu of the same model, I believe its already been proven the i3 will win. [/citation]

Where was this proven? Can you link it? Please don't bother with any tests run before August; Hybrid XFire was poorly supported when Llano first came out, and even the non-XFired APU + discrete did better. Granted, it still is pretty inconsistent, but it has improved since then.

[citation][nom]Clonazepam[/nom] Then, if you take into account the llano can use 1866 RAM speeds, and up to 2000 on certain boards, you are just throwing more money at an already budget system. In addition, you'd have to add a little more RAM to the llano system to compensate if some of that memory should be dedicated to the integrated solution. So now you are adding the fastest possible RAM and more of it to llano, just to try to make it a fair fight with an i3 + single gpu (god forbid you should xfire 2 of them with an i3). Now factoring all of that in, and actually trying to set up a test bench with all of those variables, should prove more tedious than its worth, and the idea should whither on the vine, so to speak. More power to anyone that wants to bother, and throw all those "extra fast" goodies in there, but most buying llano, would pick up cheap ddr3 1600 cas 9 ram, probably 8GB of it, and go to town. I own the A8-3870K, but haven't fired it up yet. Admittedly, I was more excited about it before I read this article, but now I'm well grounded and know exactly what to expect from it with its integrated graphics (based on previous benchmarks, it all ties together nicely now).[/citation]

No one said you had to use 1866 MHz. When TH did actually test the difference between speeds, the biggest jump was going to 1600 MHz. 1866 WILL give slightly better results, but you don't have to use it to get results. The faster memory would also be put to better use with the IGP, mainly if you're OCing, since faster system memory also makes the graphics memory faster. It really doesn't take a lot of RAM either, usually 512MB, and you could bump it to 1-2GB if you wanted. It's really grasping at straws to fault the Llano for utilizing faster memory.
 

Only two games tested, but 6% and 8.5% gains going from 1333 to 1866.

http://www.madshrimps.be/articles/article/1000220/AMD-FX-8150-Bulldozer-CPU-Review/4

But if you're doing a review on how bad AMD is, why test it with its supported speeds if Intel sees no benefit?
 


Questions like this are why reading the article is a good idea:

"We're including a Core i5-2500K operating at 4 GHz in order to measure to see if these lower-priced models compare favorably to a higher-end overclocked processor. "

Which is pretty self-explanatory. Or would you prefer we kept comparison data out of the picture to prevent us from getting an idea of how these sub-$200 models compare to high-end overclocked kit? Personally I think it puts everything into perspective; I can only assume it irritates you because you are an AMD fanboi.




This is a CPU comparo, not a memory comparo. Are you suggesting that we should stack the deck and give the AMD CPUs faster memory? Are you also suggesting that a $200 FX-8120 would be able to catch stock $125 i3-2100 gaming performance with even more expensive, faster memory? Because I don't think so.



Oh, I don't know, mostly because the 8120 we have here won't go past 4.2 GHz without crashing. 😉




That's an interesting conclusion, although I think most folks are able to get a lot more out of it than that. But you appear to have had some foregone conclusions and product bias before you even started, so open-mindedness might not be your forte.
 
Looking back over the article, it's pretty good. It really showcases the sub-$200 gaming CPUs. I would have liked to see a 960T, as it's the only Phenom II CPU Newegg.com has ATM. Seeing how it stacks up against the 955 would be nice, as its 3.4 GHz turbo could be surprising. The Athlon II 631 would have made a good overclocking choice due to its low cost, being basically the same CPU as the A8-3870K.

Good work Cleeve. I value your PC knowledge more than anyone.
 
[citation][nom]SpadeM[/nom]PS: Also as a side note, my own i5 2400 allows me to "overclock" (31 to 38) with no base clock increase, through a limited multiplier on the UD4 motherboard and so if Asus has this ability which i assume it does, could have posted overclocked numbers for those parts too.[/citation]

IKR? I have an Asus P8P67M-Pro with an i5 2400 running at 3.9 GHz (it is actually a bit more than 3.9 but I can't remember the exact decimal) and it is perfectly happy where it is. I always see the 2500K here on TH being clocked at 4 GHz, and I don't think the difference between the two is enough to fret about the K moniker. With all the benefits at the sub-$200 price level, I wish that TH would show what the 2400 is really capable of, not just stock 3.1 GHz.
 
This article is fine as a demonstration of the CPUs. However, using 4GB of RAM is BAD even for the demonstration -- you are losing 2-6+ FPS at minimum, and on a 64-bit game a heck of a lot more. Not to mention skewing the results. See below.

That said, no one should ever pair a $550+ HD 7970 with a sub-$200 CPU, never mind only 4GB of RAM! It would be a CrAzY setup!

I assume Microsoft Windows 7 x6, Service Pack 1 is a typo, and the OS is Windows 7 x64. No, I didn't read the entire article; just quick observations.

4GB vs 8GB vs 16GB (Dual Channel) {GART}:

ref - http://www.tomshardware.com/reviews/ram-memory-upgrade,2778-8.html
 
[citation][nom]Outlander_04[/nom]This is probably the least useful set of benchmarks ever on Tomshardware . It is not of any use to anyone building a budget pc , which it seems to be is the point of using sub $200 processors .The use of a $470 graphics card with this build negates the entire point of making the comparison .[/citation]

Your inability to grasp the point of this article does not indicate its usefulness or lack thereof.

The $470 graphics card STRESSES the point of the article, it underlines it and makes it useful. How much CPU do you need? How much makes a difference? Those questions are nullified by adding a GPU bottleneck. It is the essence of the point: does the CPU make a difference when gaming? Clearly it can.
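The argument can be put in a first-order model: the frame rate you see is roughly the slower of what the CPU can prepare and what the GPU can render. This is a deliberate simplification for illustration, not a claim about how the article measured anything:

```python
def estimated_fps(cpu_fps_limit, gpu_fps_limit):
    # Each frame must be both prepared by the CPU and rendered by the GPU,
    # so the slower of the two limits the observed frame rate.
    return min(cpu_fps_limit, gpu_fps_limit)

# With a midrange GPU capping everything at 40 FPS, a slow and a fast CPU
# look identical...
assert estimated_fps(60, 40) == estimated_fps(90, 40) == 40

# ...but a fast GPU (the 7970's role here) exposes the CPU difference.
assert estimated_fps(60, 120) == 60
assert estimated_fps(90, 120) == 90
```

That is why removing the GPU bottleneck is the only way a CPU comparison can show a spread at all.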

And I don't necessarily agree with your suggestion that a 6850 would make a huge difference; I think that's wishful thinking. Besides, our goal was playable 'realistic' performance; with a 6850 we'd have simply lowered detail across the board and probably ended up with results similar to what we have here.

We'll have to agree to disagree I suppose. But I think you've missed the point.
 
[citation][nom]jaquith[/nom]This article is fine as a demonstration of the CPUs. However, using 4GB of RAM is BAD even for the demonstration -- you are losing 2-6+ FPS at minimum, and on a 64-bit game a heck of a lot more. Not to mention skewing the results. [/citation]

With the vast majority of folks actually running 32-bit Windows with a 3 GB usable limit, I would argue the results would have been skewed with 8 or 12 GB of RAM on a 64-bit OS.
 


The point that should be addressed is that budget builders would never build such a machine. On the systems forum we are regularly asked how to build a gaming computer for $500 or $600. This article does nothing to clarify whether an FX or i3 is a better choice.
Seriously... who uses a $110 processor in a machine with a nearly $500 graphics card?

Perhaps this was a point you were unable to grasp.
 
[citation][nom]zepfan_75[/nom]IKR?, I have an Asus P8P67M-Pro with an i5 2400 running at 3.9 Ghz...I always see the 2500K here on TH being clocked at 4ghz, and i don't think the difference between the two is enough to fret about the K moniker. With all the benefits at the sub-200 price level I wish that TH would show what the 2400 is really capable of, not just stock 3.1Ghz.[/citation]

Most enthusiasts with the K series have them running at 4.5GHz and up. At 4GHz the 2500K is just sitting back sipping wine. I've had mine to 5.1GHz on air but backed it down to a more stable, everyday-safe 4.83GHz. So yeah, it's worth the extra $$ to enthusiasts like me who like to squeeze every additional frame out of high-resolution gaming as the GPU becomes more and more the barrier.
 
Reviewing the monthly CPU hierarchy chart, it looks like most AMD CPUs may need to be moved down a notch, especially if the chart is rating stock performance. Sad, but sometimes the truth hurts.

...but I DO think memory appropriate to each platform should be used, the way it would be in an actual system. If that means AMD CPUs get paired with DDR3-1600 and Intel only gets DDR3-1333, then so be it. I'm still skeptical of how much real difference there will be.
Until now, budget builders have not bought $500 GPUs to go with $125 CPUs, but if it makes sense to do so, well then there you have it. CPUs are now so much more powerful across the board than they were 2-3 years ago (yes, even AMD) that it may be time to challenge some old notions of balance.
 


As for the 2500K, yes, I saw that; the point is, why even have it there other than to show how superior Intel is?

As for the memory, Tom's has tested it over and over on Intel systems and you gain 1%, but I have yet to see anything along the same lines with AMD. Their architectures are massively different, so are the results. Test both systems with 1866 memory; don't cripple AMD just because Intel doesn't benefit. It's not AMD's fault that Intel can't utilize faster memory.
 
[citation][nom]noob2222[/nom]As for the 2500k, yes I saw that, point is why even have it there other than to show how superior intel is.As for the memory, toms has tested it over and over with intel systems that you gain 1%, but I have yet to see anything along the same lines with AMD. Their architectures are massively different, so are the results. Test both systems with 1866 memory, don't cripple AMD just because Intel doesn't benefit, its not AMD's fault that Intel can't utilize faster memory.[/citation]

There are two things you can't seem to understand:
1 - Faster memory means a higher price. If you have to pay more for faster memory just to get closer to (and not even reach) the competition, it's a bad deal already. There have been tests with everything from 1866 all the way down to 1066, and the difference is huge when using the IGP, but marginal when using only the CPU portion of the chip.

Memory beyond 1333 MHz only matters if you're really trying to get the most out of the IGP. I think AMD should even allow triple/quad channel on these APUs, so the IGP won't hit a memory bottleneck with 3/4 modules of 1333 MHz memory.

2 - The overclocked i5 2500K is there to show how far the game lets your CPU go until you hit a GPU bottleneck.
On games like Metro, there's less difference between the Phenoms and the i5 because the game is bottlenecked on the GPU.
Skyrim and SC2 (and any massively multiplayer game), on the other hand, clearly demand as much CPU as you can give them. Now we have the data to show how far you 'could' go with a high-end CPU. It's there to compare prices, to see if it's really worth the extra $100 or so.
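The memory-bandwidth point about the IGP can be made concrete with peak-theoretical numbers. A quick back-of-the-envelope calculation (peak bandwidth only; real sustained bandwidth is lower, and the 64-bit-per-channel figure is the standard DDR3 bus width):

```python
def ddr3_bandwidth_gbs(transfer_rate_mts, channels, bus_width_bits=64):
    # Peak theoretical bandwidth = transfers/sec * bytes per transfer * channels.
    bytes_per_transfer = bus_width_bits // 8
    return transfer_rate_mts * 1e6 * bytes_per_transfer * channels / 1e9

dual_1333 = ddr3_bandwidth_gbs(1333, channels=2)  # ~21.3 GB/s
dual_1866 = ddr3_bandwidth_gbs(1866, channels=2)  # ~29.9 GB/s
quad_1333 = ddr3_bandwidth_gbs(1333, channels=4)  # ~42.7 GB/s
```

Note that two extra channels of cheap 1333 would buy the IGP far more bandwidth than upgrading two channels to expensive 1866, which is the triple/quad-channel argument in numbers.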

[citation][nom]Cleeve[/nom]Questions like this is why reading the article is a good idea: "We're including a Core i5-2500K operating at 4 GHz in order to measure to see if these lower-priced models compare favorably to a higher-end overclocked processor." Which is pretty self explanatory. Or would you prefer we kept comparison data out of the picture to prevent us from getting an idea how these sub-$200 models compare to high-end overclocked kit? Personally I think it puts everything into perspective, i can only assume it irritates you because you are an AMD fanboi. This is a CPU comparo, not a memory comparo. Are you suggesting that we should stack the deck and give the AMD CPUs faster memory? Are you also suggesting that a $200 FX-8120 would be able to catch stock $125 i3-2100 gaming performance with even more expensive, faster memory? Because I don't think so. Oh, I don't know, mostly because the 8120 we have here won't go past 4.2 GHz without crashing. That's an interesting conclusion, although i think most folks are able to get a lot more out of it than that. But you appear to have some foregone conclusions and product bias before you even started, so open mindedness might not be your forte.[/citation]

I admire your patience, sir.
 

Seriously, if you're spending $1000+ on an entire system, are you going to be ecstatically happy that you saved $5-10 on cheap crap memory? Put it this way: you saved 1% to sacrifice 5-10% performance. Not a smart investment.
 
The only problem I really have with Intel right now is the price and features of motherboards compatible with their processors. It is almost impossible to find an Intel board that has 8 SATA ports and costs under $300, and the ones that do are usually utter rubbish. I don't have that problem on the AMD side :\
 
A lot of the above comments have some good points and some bad points so it's hard to say who's wrong and who's right since so many people are both at the same time on different points they tried to make.

For everyone complaining about using the 7970 in this comparison: it was used to show how good each CPU really is. It allowed benchmarking at 1080p with high settings without creating a GPU bottleneck. Because of this, the CPUs were compared rather than the whole system, so we know that CPUs that failed here aren't worth buying, since they will be the bottleneck with any GPU. It would have been nice to see a budget card or two thrown in to compare the CPUs with realistic cards, but there were good reasons for using the 7970.

For people complaining about the i5-2500K being here, it was used to compare the performance of budget CPUs and high-end CPUs, to see if the price is worth it. I think the 2500K should have been overclocked at least several hundred MHz more than it was, since most buyers won't stop at 4GHz. A higher clock rate would have provided a better comparison against high-end systems.

For people complaining about the 1333MHz memory used, it doesn't matter that better memory was an option. It would not be used in budget systems and would thus skew the result a little. The point was to benchmark the CPUs with all else being equal, and changing the memory would not have helped this goal. I do believe that 1600MHz memory would have been reasonable since it is about the same price as 1333MHz; if I were doing the testing I would have used whichever was cheaper. If 1600MHz was cheaper at the time (sales let some 1600MHz kits dip below 1333MHz kits, but I haven't seen one in a while on Newegg.com) then I would have preferred 1600MHz, but oh well, 1333 shouldn't change the results much; the order should be the same. If one system used 1600MHz then all of them would. Anything above 1600MHz would be unreasonably expensive and would not help this comparison at all; it might even be detrimental. AMD's CPUs weren't crippled because Intel doesn't need the extra bandwidth; faster RAM wasn't used because it would have been too expensive. Unlike using the expensive 7970, using faster RAM would not provide a better look at the CPU as used in a budget system.

As for the RAM capacity, I think 8GB should have been used. Even in budget systems today, 8GB is easy to fit in ($25-$35 for 1333MHz, a little more for 1600MHz) so there is no good reason to use less than 8GB on a dual-channel system. 6GB is reasonable on a triple-channel system but there is no excuse for using 4GB unless these are older parts. Was there a reason for using only 4GB?

For anyone complaining about the use of a 32-bit OS: if you bought the OS from Newegg or another retailer, you can upgrade to 64-bit for free. For everyone else, you can probably switch it out somehow unless you are running XP. XP Pro x64 isn't that good anyway; it has pretty poor driver/software support compared to Vista/7 x64.




You have failed to understand the Bulldozer architecture. These are not 2/3/4-core parts with hardware Hyper-Threading; there really are 4, 6, or 8 integer cores. Within each module, two cores share two 128-bit FPUs (which can work together for 256-bit FP math) and other hardware. Hyper-Threading duplicates some hardware so two threads can share a single core, while Bulldozer shares hardware between two cores; the concepts are almost polar opposites. AMD's concept isn't why Bulldozer failed so much as their decision to do what Netburst did: increase pipeline length tremendously and reduce IPC in other ways while attempting to increase clock speed enough to make up the difference. The problem is that clock speed can't go much higher without decreasing power usage tremendously, because AMD's CPUs just use too much power. The poor performance of AMD's SRAM caches really doesn't help them either. They probably use more power at stock than a moderately overclocked Sandy Bridge quad core (i5 or i7), though I admit this is an estimation at best.
 
[citation][nom]noob2222[/nom]Seriously, if your spending $1000+ on an entire system, are you going to be extatically happy that you saved $5-10 on cheap crap memory? Put it this way, you saved 1% to sacrifice 5-10% performance. Not a smart investment.[/citation]

Excellent point.
 
[citation][nom]kancaras[/nom]am i the only one noticing that they are testing 100$cpu + 500$ gpu VS 100$apu + 500$ gpu?[/citation]


It's for eliminating the GPU bottleneck, which is essential to these tests.
 
[citation][nom]Outlander_04[/nom]This article does nothing to clarify whether an FX or i3 is a better choice.[/citation]

Seems quite clear the i3 is better.

Pretending the benchmark results are irrelevant because we've shifted the bottleneck to the CPU doesn't work for me, but have fun with that. 😉
 
[citation][nom]jtt283[/nom]Reviewing the monthly CPU hierarchy chart, it looks like most AMD CPUs may need to be moved down a notch, especially if the chart is rating stock performance. [/citation]

Absolutely! These results will of course be referenced for changes in next month's update.
 
[citation][nom]noob2222[/nom]As for the 2500k, yes I saw that, point is why even have it there other than to show how superior intel is.[/citation]

No, the point is we're "including a Core i5-2500K operating at 4 GHz in order to measure to see if these lower-priced models compare favorably to a higher-end overclocked processor. " It's there to see if it provides an advantage over sub-$200 models, to show people what the extra money buys them in terms of performance.

Based on your candor, I suspect that if it had turned out all these games were GPU-bottlenecked and the FX-4100 performed as fast as the i5-2500K, you'd have been tickled pink we included it. So don't shoot the messenger.

Inserting your own imagined conspiracist agenda because you'd prefer to bury the plain truth doesn't fly here.
 
[citation][nom]Cleeve[/nom]Seems quite clear the i3 is better.Pretending the benchmark results are irrelevant because we've shifted the bottleneck to the CPU doesn't work for me, but have fun with that.[/citation]
Your testing methodology is about as valid as building a computer with a Z68 motherboard, a 2500K processor, and 8 GB of RAM, fitting an nVidia GT 220 graphics card, running benchmarks, and then declaring that the 2500K is a useless gaming processor.

What would have been useful is testing the strengths of the i3 and FX architectures in real computers that someone might actually build.
That would have worked for me.
 
Care to overclock the Athlons to show the ponies they can push? The real value in the Athlon II X3 and X4 models was that you could overclock them so much, and you barely had to touch them. Stock-cooler X4s were being overclocked over 500 MHz.

Especially when comparing the FX series, it'd be nice to show the older models where they count.
 


By your twisted logic we should perform all of our graphics card tests with a Sempron. Once again, we don't want to test the graphics card, we want to avoid the bottleneck. Why bother introducing a bottleneck on purpose? That makes no sense for a CPU test.




Our tests show that you can quite comfortably build a gaming rig with a sub-$200 CPU (i5-2400, for example) and be sure you're getting the most out of a 7970.

That's real, verifiable, and defensible. Your protest has been invalidated.
 