Best Gaming CPUs For The Money: January 2012 (Archive)

Just like every month for the last three years, AMD's benchmark results are all over the shop depending on the game, and they only look good (if you could even call it that) in GPU-bound titles that give the general public the idea that the CPU is good, when really you want a CPU that handles every game well. And that CPU Hierarchy Chart is the biggest load of crap I have ever seen. How about a single-core hierarchy chart and a multi-core hierarchy chart? That way people could see the real truth and not just a whole lot of 'oh, but...' excuses that Tom's must be getting paid to post, such outright lies. P.S. Whoever redid your website needs to be fired, or at least given a written warning, because it's garbage now.
 
Keep in mind that no one tries to present these charts as a be-all, end-all regarding CPU performance. From a certain point of view, you are absolutely correct in that they are flawed. Exceptions and outliers are the rule. But, in general, the chart serves as a decent guideline. A wise builder considers the specific purposes for which a machine will be built, and considers a variety of detailed sources of performance data. A "common" hobbyist, or someone attempting a DIY-PC for cost savings, personal satisfaction, or most any reason other than competitive benchmarking, can use this chart and the accompanying article, and be pretty well assured of building a machine that doesn't suck.
 


You are missing the point of the chart completely. This isn't a "ranking" of the best CPUs as you bench them; it's a ranking of the best CPUs as you use them. The rankings are determined by a very simple question: "can you, the user, tell the difference in performance without a benchmarking test?"

The point of the "tiers" is that those are the separations where you, as the end user, could actually tell the difference in chip performance. In short, if you built a generic black-box PC with the CPU hidden, you wouldn't be able to detect any difference in performance between the chips on the same "level". Every single AMD on the 2nd tier might bench slower than every single Intel on the same level in every bench and every test... but what Tom's is saying is the end user won't be able to tell them apart, because the performance is TOO CLOSE for a human to detect.

In fact, Tom's goes further than that... and claims any chip within 3 levels of another will be so close in performance (from the end user's perspective) that it wouldn't even be worth the money to upgrade to the higher-level chip. You'd need to go 4 levels higher for the difference in performance to be big enough to feel like an upgrade worthy of the cash spent.
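To make that rule concrete, here is a minimal Python sketch of the upgrade heuristic described above. The function name and the tier numbers are made up purely for illustration, and it assumes tiers are counted from the top of the hierarchy chart (1 = fastest group):

[code]
# Sketch of the "worth upgrading?" rule of thumb from the post above.
# Nothing here comes from Tom's chart itself; the names and numbers are illustrative.

def worth_upgrading(current_tier: int, new_tier: int, tiers_needed: int = 4) -> bool:
    """True only if the new chip sits enough tiers above the current one.

    Smaller tier numbers mean faster groups (tier 1 is the top of the chart).
    """
    return (current_tier - new_tier) >= tiers_needed

print(worth_upgrading(current_tier=7, new_tier=4))  # False: only 3 tiers up, too close to notice
print(worth_upgrading(current_tier=7, new_tier=3))  # True: 4 tiers up, a felt upgrade
[/code]

In other words, within three tiers the chart treats chips as effectively interchangeable from the user's chair; only a four-tier jump is supposed to feel like money well spent.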



+1

this poster gets it
 


If you are not looking to overclock, the lower power draw of the 6300 would be a good thing versus the extra few MHz.
 
@JustPosting101 Every article you see or read that compares Intel and AMD will always be biased against AMD; that's all the fanboys on Intel's side. The only thing we can do is support AMD as much as we can and hope they stay in the game until they release the new 28nm Steamroller. That's when we pwn Intel; until then, we'll be losing every fight. I'm surprised they didn't put the FX-8350 on here, because they know that's the best CPU there is and they ignore it as if it doesn't exist!!!
 
I hate to say something bad about Tom's, because I love this site. But...this article's format is outdated and no longer usable. You guys (Tom's) need to update your approach.
Let me explain.
The Intel chips have a small advantage at low thread counts due to a more efficient core design. This advantage is exaggerated when benchmarking at low resolutions such as 800x600, 1024x768, or even 1280x720, and it is further amplified in game titles where Intel partnered with the developers or invested heavily in the underlying engine/technology.
But as soon as you start throwing 4+ threads at a 1080p resolution at them, the AMD chips have the advantage. And more and more games are starting to use 4, 6 and even all 8 threads available on the AMD CPUs.
So I'm hoping you guys will look at updating this article format to include thread counts, or at least low- and high-thread categories. Because if you play Skyrim on an i3 and then decide you want to try Crysis 3, you'll be wondering where your performance went.
 

Resolution doesn't change things, except to make (some) games GPU-limited. Which would be stupid when you're testing the limits of CPUs.

You're right about Crysis 3 vs. Skyrim and multithreading though. I'm not sure splitting out performance on "low threaded" and "high threaded" would work all that well... but it's at least one way that the differences between AMD and Intel could be made more obvious than they are in the current system. It was particularly jarring when they switched from recommending Pentiums to Athlons and Phenoms. The Pentiums are still better in less threaded games, while the Athlons and Phenoms (quad core ones) were better in more threaded games all along.

Meh. Having a single list of best CPUs for the money is never going to be an exact science.
 


I think the 6350 should replace the i3 as well. With the recent price drops, which were mentioned on THG before this latest Best Of came out, the Phenom II is at about what the Athlon II X4 640 was listed at. The list should probably be Phenom II 965, FX-4350, FX-6350, with the 8320 getting at least an honorable mention due to its overclocking capability versus the i5-3350P, which is more locked down.
 

Definitely! We cannot forget that the 4350 and 6350 have more cache also.
 
Looks like my original comment did not post.
Basically, a one-size-fits-all kind of chart/approach is flawed and doesn't accurately represent the CPU/gaming landscape at present.
Let me explain:
The Intel offerings will do very well in games with low thread counts thanks to their more efficient single core design. This gap is exaggerated at low resolutions such as 800x600, 1024x768 and even 720p. But the AMD 6 and 8 core offerings will run with, and sometimes even beat, the i5 and i7 at 1080p in heavily threaded games, and for less money.
So the 'best CPU for the money in gaming' really boils down to the titles you play and whether or not they can make full use of all the cores available. If you're using 2-4 cores, get an i3 or i5. But for titles that can use 6+ cores, get a 6300 or 8300.
I've been reading Tom's for nearly a decade now and really enjoy the site and reviews. But I honestly feel this article category needs to present more diversified data to the reader in order to be as accurate and useful as possible.
 
Since AMD and Intel now put graphics processors inside many of their CPUs (APUs), it would be good to create a new topic like "Best Gaming APUs For The Money: May 2013".
I mean a benchmark using only APUs (without a discrete GPU).
Thanks!
 

That's actually pretty cool news. Too bad for AMD that Haswell is right around the corner to drown out that little gem.
 

clock-for-clock bench is useful for academic purposes. if anything, it might reveal how 965be performs better than the other two except in aes-ni benches.

[strike]too little too late, amd.[/strike] better late than never, i guess. if chosen, it'd be interesting seeing how paul fits this 100w cpu in a $500 sbm build.
i thought that amd dropped prices on some cpus, but newegg shows no sign of that (fx8350 is $200, up from $180/190). i was pretty sure that ph ii x4 965be and fx4300 would be cheaper now.
 

one 7790 per build sounds good. if it was me, i'd rather wait for amd to release the frame pacing driver first.
 
Hi Don,
the Gaming CPU Performance Per Dollar page and graph have gaming performance and dollar amounts, but never actually calculate Performance Per Dollar.

Using the average performance % vs price on the legend, 5/26:

Athlon II X4 640: 0.80
Phenom II X4 965: 0.75
AMD FX-4300: 0.63
Intel Core i3-3220: 0.62
Intel Core i5-3550P: 0.51
Intel Core i5-3570K: 0.44
Intel Core i7-3930K: 0.18

The performance-to-price ratio does *not* scale evenly; for that to happen, the ratio would have to remain the same at each price point. Instead, even excepting the 3930K, the ratio falls drastically from 0.80 to 0.44, which is not scaling well at all. The fall-off is linear except for the 3930K, which would have to be priced at ~$300 instead of $570 to stay on the line.
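For what it's worth, here is a minimal Python sketch of the missing performance-per-dollar calculation. The performance percentages and prices below are placeholders chosen only to roughly reproduce a few of the ratios listed above; they are not figures taken from the article's chart, so plug in the real legend values to get the actual column.

[code]
# Sketch of the performance-per-dollar column the graph never computes.
# The (performance %, price) pairs are illustrative placeholders, NOT Tom's data.

cpus = {
    "Athlon II X4 640":    (80.0, 100.0),   # placeholder values
    "Intel Core i5-3570K": (97.0, 220.0),   # placeholder values
    "Intel Core i7-3930K": (103.0, 570.0),  # placeholder values
}

for name, (perf, price) in sorted(cpus.items(),
                                  key=lambda kv: kv[1][0] / kv[1][1],
                                  reverse=True):
    print(f"{name:22s} {perf / price:.2f} performance points per dollar")
[/code]

Run against the real numbers, a flat performance-per-dollar column is what "scaling evenly" would look like; the steep drop toward the 3930K is exactly the fall-off described above.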
 
There seems to be a typo in your CPU hierarchy... there are not two versions of the FX-6350, and it is no different from the FX-6300 except for the clock speed and TDP wattage. Therefore they are the same. Bad call.
 
It's funny how people read these biased articles and get drawn into the subject, but forget that Tom's Hardware doesn't care about people's opinions; instead, it wants you to argue amongst one another. If you are really smart, look for other review articles to see whether they line up with Tom's. Furthermore, you are wasting time making comments and getting bashed by people who have already fallen into the trap of a bad article.
 


I like what you have done; I just wish Tom's could have done it for the whole CPU gaming lineup. Great job!
 