Picking A Sub-$200 Gaming CPU: FX, An APU, Or A Pentium?

Hope someday Tom's could have a best entry/mid/high-level gaming system for the money: an overall system review with price ranges on it. I think that would be more of a help in deciding, especially for us beginners. Thanks.
 
[citation][nom]jheb0y[/nom]Hope someday Tom's could have a best entry/mid/high-level gaming system for the money: an overall system review with price ranges on it. I think that would be more of a help in deciding, especially for us beginners. Thanks.[/citation]

Tom's has the System Builder Marathons for a low-budget, high-budget, and very-high-budget machine four times a year, or something like that.
 
These tests are worthless nonsense because they used slow RAM that keeps the new cheap Sandy Bridge parts (G860 and G630) looking competitive with the AMD parts, when the Intel chips only support 1333 MHz and the AMD parts support 1866 MHz. Frankly, this article reads like a character assassination of AMD, particularly when AMD CPUs under $200 were not benchmarked and the test hardware setup was tweaked to obviously favor the new Intel Sandy Bridge chips.

Obviously, I'm speaking of the FX-4170, which appears to destroy the cheap Intel parts according to the gaming CPU hierarchy chart, but is conspicuously absent from objective benchmark tests in articles published since its release.

Get with the program, Tom's.
 
[citation][nom]hanskey[/nom]These tests are worthless nonsense because they used slow RAM that keeps the new cheap Sandy Bridge parts (G860 and G630) looking competitive with the AMD parts, when the Intel chips only support 1333 MHz and the AMD parts support 1866 MHz. Frankly, this article reads like a character assassination of AMD, particularly when AMD CPUs under $200 were not benchmarked and the test hardware setup was tweaked to obviously favor the new Intel Sandy Bridge chips. Obviously, I'm speaking of the FX-4170, which appears to destroy the cheap Intel parts according to the gaming CPU hierarchy chart, but is conspicuously absent from objective benchmark tests in articles published since its release. Get with the program, Tom's.[/citation]

There are no FX-4170 benchmarks because it is just a higher-clocked 4100, which has already been benchmarked, even when overclocked to 4.2 GHz (the 4170's clock frequency) and up to 4.5 and 4.6 GHz.

Having faster RAM would have made the AMD system far more expensive, so the Intel system would have gotten a similar upgrade within the new budget and won by an even greater margin, because for the money, faster RAM doesn't help an APU as much as going from the 6670 to the 6750 would have helped the Intel system. Sorry, but you are completely wrong. Besides that, how is it Tom's fault that AMD builds its CPUs with such crappy memory controllers that they need faster memory just to match an Intel memory controller with slower RAM? Intel's controller has about 20-30% more bandwidth than the AMD controller at the same memory clock frequency. It is also lower latency.
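
To put rough numbers on that (a back-of-the-envelope sketch; the controller efficiencies are illustrative assumptions, not measured values):

[code]
# Rough dual-channel DDR3 bandwidth math for the point above.
# Controller efficiencies are made-up illustrations, not measurements.

def peak_bandwidth_gbs(transfer_rate_mts, channels=2, bus_bytes=8):
    """Theoretical peak: MT/s * 8 bytes per 64-bit channel * channel count."""
    return transfer_rate_mts * bus_bytes * channels / 1000

intel_1333 = peak_bandwidth_gbs(1333) * 0.90  # assumed ~90% efficient controller
amd_1866   = peak_bandwidth_gbs(1866) * 0.65  # assumed ~65% efficient controller

print(f"Intel @ DDR3-1333: {intel_1333:.1f} GB/s effective")  # ~19.2 GB/s
print(f"AMD   @ DDR3-1866: {amd_1866:.1f} GB/s effective")    # ~19.4 GB/s
[/code]

Under those assumed efficiencies, AMD at 1866 MHz only just catches Intel at 1333 MHz, which is exactly the point.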

Frankly, your comment seems like a character assassination of Tom's. Just because AMD can't win on fair grounds and still doesn't win on unfair grounds does not mean that you get to troll the forums to vent your frustration. That's the problem with ignorant people... they don't know they're stupid, and then they go on to insult more experienced and knowledgeable people for having differing opinions, even when the insulted party has the proof right in front of them to show how wrong you are.
 
Whatever. You completely missed the real reason I was wrong.

Personally, I felt like something was wrong with the idea that Tom's would test with anything but the best memory all processors in the test could utilize without some kind of disclaimer, so I went and reread and did some more research. It turns out that I was wrong on the memory speed when I penned that original post; I had misread which memory speed was used in the benchmark and was also sorely misinformed about what RAM speeds the Sandy Bridge parts support. So there you have it. I was wrong, and you missed this glaring error.

I still think it is asinine not to update this article with actual tests of the FX-4170, considering that Tom's ranks it as the best gaming CPU AMD offers in the gaming CPU hierarchy charts, and I really don't care if you disagree. It is an annoying omission that makes the site look lazy at best, and it is out of character for the best hardware review site there is. Normally on Tom's Hardware, objective tests are used to formulate recommendations, not a bundle of assumptions, and I'm not the only person who thinks the FX-4170 should be benchmarked on this site.
 
[citation][nom]hanskey[/nom]Whatever. You completely missed the real reason I was wrong. Personally, I felt like something was wrong with the idea that Tom's would test with anything but the best memory all processors in the test could utilize without some kind of disclaimer, so I went and reread and did some more research. It turns out that I was wrong on the memory speed when I penned that original post; I had misread which memory speed was used in the benchmark and was also sorely misinformed about what RAM speeds the Sandy Bridge parts support. So there you have it. I was wrong, and you missed this glaring error. I still think it is asinine not to update this article with actual tests of the FX-4170, considering that Tom's ranks it as the best gaming CPU AMD offers in the gaming CPU hierarchy charts, and I really don't care if you disagree. It is an annoying omission that makes the site look lazy at best, and it is out of character for the best hardware review site there is. Normally on Tom's Hardware, objective tests are used to formulate recommendations, not a bundle of assumptions, and I'm not the only person who thinks the FX-4170 should be benchmarked on this site.[/citation]

On H67, Sandy Bridge only supports up to 1333 MHz. Going above that, even on the platforms (P67, Z68) that support it, gives minimal gains in performance outside of a few very specific applications (rendering, archiving, and some folding are the only things that come to mind that make truly good use of it). Whether or not the SB CPUs support more than 1333 MHz doesn't matter much.

The 4170 is identical to the 4100 in all but voltage and clock frequency. It isn't even binned better than the 4100. Benchmarking the 4170 would be exactly the same as benchmarking the 4100 at 4.2 GHz with a 4.3 GHz turbo. That's not an assumption; it is backed by what other sites have done. Tom's benchmarking it would have been redundant because we already know for a fact that the 4100 and 4170 are the same chip, the 4170 just being a higher-clocked version with the same binning.
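
If you want to estimate a 4170 result from a 4100 result anyway, it's just a clock-ratio scaling (a minimal sketch; the fps figure is hypothetical, and linear scaling with clock is itself an assumption):

[code]
# Estimate FX-4170 performance from an FX-4100 result by clock ratio,
# assuming performance scales roughly linearly with core clock
# (this ignores turbo behavior and memory effects).

fx4100_base_ghz = 3.6   # FX-4100 stock base clock
fx4170_base_ghz = 4.2   # FX-4170 stock base clock
fx4100_fps = 45.0       # hypothetical stock FX-4100 benchmark result

estimated_fx4170_fps = fx4100_fps * (fx4170_base_ghz / fx4100_base_ghz)
print(f"Estimated FX-4170 result: {estimated_fx4170_fps:.1f} fps")  # ~52.5
[/code]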

No, I didn't care to check whether you were right or wrong about what RAM frequencies were used, or to correct you on the supported memory clock frequencies. I do know that on some P67/Z68 boards, the i5s and i7s can run 2400 MHz memory, and some people go faster, even if only for record-breaking.
 
[citation][nom]blazorthon[/nom]On H67, Sandy Bridge only supports up to 1333 MHz. Going above that, even on the platforms (P67, Z68) that support it, gives minimal gains in performance outside of a few very specific applications (rendering, archiving, and some folding are the only things that come to mind that make truly good use of it). Whether or not the SB CPUs support more than 1333 MHz doesn't matter much.[/citation]
Are you really defending the fact that Intel gets no performance gains from faster RAM, and then trying to claim that is actually a good thing??

It seems like something is very wrong with the processor if it can't squeeze more performance out of better RAM, unlike all previous processor generations, and I'm shocked that anyone could consider that a feature and not a bug.
 
[citation][nom]jezus53[/nom]I completely agree. I've been wanting to build a somewhat light gaming machine based on these APUs, but I haven't really found anyone that tests them all as they are. Instead they throw in a discrete card and scream that Intel is better. Though that is true with discrete graphics, I want to know how it does with the GPU on die, because I know the APUs will destroy the Intel CPUs when it comes to all-around performance based on integrated graphics. But I do still like this article; it was very well done.[/citation]
The A8 runs ME3 at 720p flawlessly, which I find acceptable on my laptop, even when I plug it into the 57-inch TV.

I'd like to see that testing as well, since pre-built PCs and laptops are dominated by integrated graphics. Presumably Tom's readership is mostly DIY enthusiasts, so we probably won't see such an article, because the results are a foregone conclusion:

AMD wins on integrated graphics, no ifs, ands, or buts. Intel can't come close to competing.

That fact also doesn't fit the "AMD doesn't have the fastest chips in the world (again), so it will die!!" meme that seems to dominate enthusiast thinking these days... so yeah, I predict that Tom's will never perform that test, although I'd love to see the results.
 
Let me also point out that of the six games tested for this article, ONLY DiRT 3 and Battlefield 3 don't use the Havok physics engine that Intel owns. 😱

Lo and behold: those games show the least performance advantage for Intel in the benchmarks, and the recommended G630 and G860 don't look good anymore. That's aside from the fact that, unless you get a DIY system, you'll never see a G860 or G630 paired with a decent graphics card. 😱 😱

Two-thirds of the tested games using the Havok engine undermines the validity of these results, but YMMV, because maybe a user prefers Havok titles, in which case "buy Intel" still makes sense. However, if all I want to play is BF3 and DiRT 3 (and maybe other non-Havok titles?), then I would get better performance/$ with several AMD solutions, and not the recommended G630 or G860, which only have shockingly good performance in Havok games.

Might explain why my A8 laptop runs Mass Effect perfectly, but wouldn't it be nice to see that game actually tested, since it is selling like mad and everyone seems to want it?
 
[citation][nom]blazorthon[/nom]There are no FX-4170 benchmarks because it is just a higher-clocked 4100, which has already been benchmarked, even when overclocked to 4.2 GHz (the 4170's clock frequency) and up to 4.5 and 4.6 GHz.[/citation]
You have stated this BS many times in the comments, but you are wrong, and I think we're all sick of you defending testing laziness. The results of testing the FX-4170 don't match the overclocked tests for the FX-4100, so stop claiming the tests are equivalent when they are clearly not.

Sometimes the overclocked FX-4100 outperforms the FX-4170, and other times it is the reverse, but the overclocked FX-4100 is not an accurate predictor of FX-4170 performance at all.

Just compare the benchmarks found here: http://www.pcgameshardware.de/aid,870241/Alle-Bulldozer-CPUs-im-Test-Inklusive-FX-8150-FX-8120-FX-6200-FX-6100-FX-4170-und-FX-4100/FX-4100/Test/ to the FX-4100 overclocked test found on Tom's, if you doubt this fact.
 

This matches what I've seen, and it points out that laptop makers simply don't add GPUs to budget CPUs.

Honestly, Llano is the first AMD CPU I've seen get popular in laptops at mainstream stores like Target, Wal-Mart, and Best Buy. Almost every manufacturer is producing AMD-based builds, and almost any consumer can actually find them, unlike basically all previous AMD mobile boxes.

Frankly, AMD looks a hell of a lot healthier now that they no longer make the world's fastest processors, because I can actually find AMD builds in mainstream consumer stores, and that simply wasn't true previously.
 

I was looking at the ordering of the processors within each test relative to each other in the same games.

Yes, they use a different hardware config, but since the CPU order changes from the overclocked FX-4100 tests to the FX-4170 tests, and not always in the same direction, it indicates that an overclocked FX-4100 is not a good predictor of FX-4170 performance.

None of the FX stuff looks really magical to me either, because I think AMD's performance issues are primarily due to a lack of sufficient memory controllers (AMD's flagship FX CPUs have half the number of memory controllers of Intel's flagship i7s), which the FX line does not resolve, plus the small performance hit (surprisingly tiny, I mean) from having half the number of FP compute units.

1. If you want a DIY desktop right now and primarily want Havok games, then Intel is clearly the way to go at all price levels.

2. If you want a DIY desktop and don't care about Havok games, then you get better performance per dollar with the FX chips than with the new G860 and G630 CPUs.

3. If you aren't interested in overpaying for components, then you go for a pre-built system and tack on a graphics card, and maybe some RAM if you really low-ball it, since the bottom-end systems are built with 4 GB. In this situation I think it is basically a wash, depending entirely on the exact systems available, because prices become rather unpredictable, and store-specific traffic can give unreal performance/price thanks to clearance sales and the like for slow-moving systems. Sometimes the AMD will be the better buy, sometimes the Intel.

4. If you want a laptop and you want to game on it for cheap, then AMD is your only option.

I was a #4 most recently, and I'm very happy with my A8; I did #3 a long time ago with AMD, back before the Phenom II and in the heyday of the HD 4850.

If I did a DIY build right now, I'd probably do the FX-4100, because most games I like aren't Havok engine games, and in non-Havok games the FX series has the best performance per dollar, with the FX-4100 leading the way; that's probably what got the FX-4100 the honorable mention in the best gaming CPUs recommendation article.
 

I gotcha. I really like my A8, but I'm afraid to overclock my laptop, because I can't afford to replace it!! :)

Right now I feel like it makes sense that the A8 nearly matches FX-4100 performance, since the FX-4100 has half the FP resources but an extra 1 GHz of clock, and an A8 is basically just a shrunken Phenom II with an on-die GPU tied to it. To me, Bulldozer has much more room for improvement because it is a brand-new and fairly revolutionary architecture, whereas the APUs simply tack a GPU onto AMD's most robust and mature CPU line, the Phenom II.

To be honest, I'm amazed that the Bulldozer architecture performs so well considering it possesses fewer FP resources than the Phenom II chips, but that's probably thanks to the higher clocks. Perhaps AMD might remedy this by keeping the shared pool of FP resources but boosting the number of actual FP computational units to the number found in the Phenom II line. I bet that if they do that, the FX chips will not need that extra GHz, and maybe that's what we'll see when the Trinity and Piledriver parts come out. There's still plenty of room for future performance bumps in both lines, and both could be made much better with the improved memory bandwidth and latency given by better and more memory controllers (by which I mean one memory controller per logical CPU).

Personally, I think that AMD will get the most bang for its buck in improving the Bulldozer line by: (1) first improving memory performance with more controllers, then (2) refining the controllers for more efficiency and throughput/latency improvements, and finally (3) addressing the loss of FP computational units from Phenom II/APU to Bulldozer/Piledriver. Who knows what they'll actually do, but I think that would be the fastest and most cost-effective way to get competitive with Intel again at the top of the charts. If Trinity and future APU successors add memory controllers and on-die GDDR5 RAM, then Intel will never be able to catch up to AMD in the laptop market, the largest consumer CPU market.

Let's agree to disagree on whether an overclocked FX-4100 is equivalent to an FX-4170, because the real reason they aren't publishing any tests of the FX-4170 is that AMD won't give them a chip to benchmark. That's what the article's author wrote many, many posts ago, and I just saw his post. I disagree that they are equivalent, and I'm 100% sure that if Tom's had an FX-4170, they'd benchmark it at stock speeds and overclocked, because normally Tom's is much more thorough than this, and the author indicated as much in his post above. Tom's hands are tied on this one, because AMD doesn't want to fuel the fire of stupid while they are making nice inroads into the laptop market; that trend could be undermined by more lackluster test results being published.
 


Intel doesn't need faster memory. You have a seriously lacking understanding of CPUs. If faster memory improves performance, that means there is a bottleneck. If Intel doesn't get an advantage from faster RAM, then that means Intel CPUs are not bottlenecked by memory performance, which means they make FAR more efficient use of their memory. Tom's even tested this: Intel gets twenty-five to thirty-five percent higher bandwidth than AMD does at the same memory frequency. That AMD needs faster memory because of its crap memory controller just to not be memory-bottlenecked is the bug.

Intel not needing expensive memory to eliminate memory bottlenecks? I consider that a feature. It saves me money on memory.
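
If it helps, here's a toy model of the bottleneck argument (every number is made up for illustration): frame time is set by whichever of compute and memory traffic is slower, so faster RAM only helps a CPU that is memory-bound to begin with.

[code]
# Toy bottleneck model: frame time is the max of compute time and the
# time needed to move the frame's memory traffic. All numbers invented.

def frame_time_ms(compute_ms, bytes_per_frame, bandwidth_gbs):
    memory_ms = bytes_per_frame / (bandwidth_gbs * 1e6)  # GB/s -> bytes per ms
    return max(compute_ms, memory_ms)

traffic = 200e6  # hypothetical 200 MB of memory traffic per frame

# With 12 ms of compute per frame: memory-bound at 14 GB/s (14.3 ms),
# compute-bound at 19 GB/s and beyond, where faster RAM changes nothing.
for bw in (14, 19, 25):
    print(f"{bw} GB/s -> {frame_time_ms(12.0, traffic, bw):.1f} ms/frame")
[/code]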
 


As far as I can tell, there were no tests shared between the overclocked 4100 in Tom's review and the 4170 in the other review, and the hardware was different, so you completely failed to provide links that actually prove you right. If I missed something in those links, then tell me what it was, but I'm damn sure that I didn't, unless it was something in the PCGH review that happened not to be in English and that Google Translate's version mangled. Next time, please stick with English links if you want to argue on an English forum.
 


You have no idea whether or not Intel can compete in gaming graphics, because so far they have simply chosen not to. If Intel wanted to, then they would. In fact, Intel thought about doing something in high-end graphics, but decided that the profits weren't great enough and converted their graphics processor (Larrabee) into a GPU-style compute board called Knights Corner or something like that.

AMD wins simply because Intel knows that it's pointless to compete in those markets. Intel's IGPs are good enough for the vast majority of non-gamers, and that is where the money is. Look at the difference between AMD, Nvidia, and Intel... Nvidia is a fairly diverse company with products in a variety of markets, yet they aren't even a tenth of the size of Intel. AMD is about two-thirds the size of Nvidia despite being the only competitor for Nvidia and Intel in the vast majority of x86 computers. Intel knows where the money is and goes for those markets. Does this mean that AMD gets a chance to win in low-end systems? Yes, it does. Llano is obviously the only way to have a low-end gaming laptop at a half-decent price. However, for desktops, the Llano A8s and A4s are not worth the money compared to other solutions. You can get an Athlon II X4 and a Radeon 6670, an A6 with a 6450 or 6570 in CrossFire, a Phenom II X3 and a 6670, or a Pentium and a 6670, all for about the same price as an A8.

As for AMD dying? There isn't much of a chance of that happening any time soon, and only idiots would think that. Even if AMD somehow fails so miserably that no one buys from them anymore, Intel will have little choice but to help AMD, because without AMD, Intel is going to be attacked by antitrust lawsuits. Intel would not let AMD die unless it knows there will be a company to replace AMD as its primary competitor. Considering that AMD's x86 license is non-transferable, Intel would probably need to license another company itself.

Also, the A8 running ME3 at 720p is nothing special. I can guarantee that it isn't near maximum quality settings at 720p in BF3 or Metro 2033.
 


FX does not have the best performance per dollar. The only FX CPUs with any good gaming performance per dollar are the 4100 and 4170. All of the others have horrible gaming performance for the money, because they increase performance by increasing core count, and core count does not help gaming beyond four cores (which is really not even twice as good as two cores, even in quad-threaded games, because the unbalanced loading of cores leaves one or two heavy threads and the rest light). The six-core and eight-core FXs are crap compared to similarly priced Intel CPUs, even if you compare overclocked FX to stock Intel performance in gaming.

Overpaying for components? Pre-builts are pretty much always more expensive than building your own and are almost always inferior in one or more ways. As for sometimes AMD being better in the low end and sometimes Intel, yes, that is absolutely true. If one goes on sale and the other doesn't, then the one on sale is the winner at that time. However, Intel's lower power usage should also be factored into the price. After doing the math, it usually takes only two to three years for the price difference to be made up with both the Intel and AMD systems at stock prices. Overclocking the AMD so that it keeps up with the Intel CPUs in gaming performance (or getting the 4170) increases power usage so much that the price difference is made up several times over in that same period. According to your German link, the 4170 uses far more than its 125 W TDP at load... You might not have known this, but the i3 uses less than its 65 W TDP even at load. So, it more or less matches the FX quad-cores (depending on the game, it either wins or loses, but rarely loses by much) at about one-third of the power usage of the 4170 and about seven-twelfths the power usage of the 4100.
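
The payback math is simple enough to sketch (every figure below is an illustrative assumption, not a quoted price or a measurement):

[code]
# Payback period for a pricier but lower-power CPU. All inputs assumed.

price_delta   = 20.0   # $ extra for the Intel part (assumption)
watt_delta    = 60.0   # extra watts at load for the AMD part (assumption)
hours_per_day = 4.0    # gaming hours per day (assumption)
rate_per_kwh  = 0.12   # $ per kWh (assumption)

extra_cost_per_year = watt_delta / 1000 * hours_per_day * 365 * rate_per_kwh
print(f"Extra power cost per year: ${extra_cost_per_year:.2f}")          # ~$10.51
print(f"Payback period: {price_delta / extra_cost_per_year:.1f} years")  # ~1.9
[/code]

With heavier use, or an overclocked chip pulling more watts, the payback comes even sooner.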
 

Ok. :pt1cable:

I guess you missed this, but I'm just going to re-post it for your convenience:

Let's agree to disagree on whether an overclocked FX-4100 is equivalent to an FX-4170, because the real reason they aren't publishing any tests of the FX-4170 is that AMD won't give them a chip to benchmark. That's what the article's author wrote many, many posts ago, and I just saw his post. I disagree that they are equivalent, and I'm 100% sure that if Tom's had an FX-4170, they'd benchmark it at stock speeds and overclocked, because normally Tom's is much more thorough than this, and the author indicated as much in his post above. Tom's hands are tied on this one, because AMD doesn't want to fuel the fire of stupid while they are making nice inroads into the laptop market; that trend could be undermined by more lackluster test results being published.
 

I've been continuously pricing out builds from Newegg and pre-built systems for the last several months, because I want a new desktop but can't afford one. In my experience, adding a graphics card to a pre-built is usually a good deal cheaper than getting parts on Newegg and DIY-ing it, but this is not easy to determine, because prices for everything are always changing!!
 

:heink: Way to miss the point!!
 

This is just pure speculation on your part. Could they leap in and compete with Nvidia and AMD in graphics? Maybe, maybe not. That would probably depend on how many GPU designers/programmers they could steal away from Nvidia and AMD, and on lots of money and willpower.


I think I've said as much in my other posts, so we agree on this.


We agree here too, but FYI, I was addressing that to others in the forums who seem to believe this is the death knell for AMD. We agree that Llano and Bulldozer, while disappointing, don't mean the end of AMD, and I think you state the case nicely here.

It is special when the Intel option at the same price can't even do that, which was the point I was making.
 


Intel Pentium G620 plus Radeon 6670. It gets about 50% higher frame rates than the A8s do (according to Tom's). There are viable Intel entry-level gaming solutions. The G620 plus 6670 is also usually slightly cheaper than the A8s and uses far less power, saving even more money over time. Its only sacrifice is in moderately/highly threaded performance; however, at this low a gaming level, that does not affect gaming performance, it only reduces the G620's productivity performance.
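
As a rough performance-per-dollar comparison in that spirit (the prices and frame rates below are illustrative placeholders, not Tom's numbers):

[code]
# Hypothetical perf-per-dollar comparison; prices and fps are placeholders.

builds = {
    "Pentium G620 + Radeon 6670": {"price": 135, "fps": 60},
    "A8 (integrated GPU only)":   {"price": 140, "fps": 40},
}

for name, b in builds.items():
    print(f"{name}: {b['fps'] / b['price'] * 100:.1f} fps per $100")
[/code]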

Intel's Larrabee is extremely different from AMD's and Nvidia's technology, so it would not be easy for their designers to move over to working for Intel. Intel didn't need them anyway when it made Larrabee and then Knights Corner and its family of products. It's not speculation that Intel can improve just like any other manufacturer; Intel has proven that they can with Knights Corner and its family of products, as well as with their CPU divisions and more.

Otherwise, yes, we seem to agree on the rest of this right now.
 


Okay, sure, sometimes a bare-bones OEM machine upgraded with custom hardware can be cheaper than a purely DIY build. However, consider what you sacrifice in doing this... If you use the OEM motherboard, you know you have a crap motherboard that won't do any overclocking. You know you have a crap PSU. You know that if you use the OEM's RAM, it is probably generic crap. Buying a partial OEM machine and upgrading it with custom hardware might sometimes be cheaper, but is it really worth it? I suppose it's a grey area.
 