Best Gaming CPUs For The Money: January 2012 (Archive)



The reason they stopped mentioning the Celerons and Pentium dual cores is that they want everyone to have at least a 4-core / 4-thread processor for at least a slight bit of future proofing. Those Pentium Gs and Celerons are already lacking in a lot of modern games, and it's only going to get worse. However, if you're the type of person who plans on upgrading your CPU down the line and just needs something to run on for now, then they'll be fine. Then you could just throw it into a small build for someone who doesn't need a lot of power.
 


Fair enough. I've overclocked mine just for kicks, and honestly the push to 4.4 was pretty easy; I didn't really need to give it much more voltage and the temps are pretty much where they were before. Going above 4.4, I feel like it gets to the point where it's not worth it because of heat and power concerns. I compared it with the stock 3.5 settings and you do notice a nice little boost in performance, but overall it's not like it made a night-and-day difference in general usage. I will say that running multiple VMs, with like 25 Firefox tabs open, and gaming feels a lot nicer with the 8320 than it did with my old 1100T, and I can tell the difference between 3.5 and 4.4 more readily under those conditions. Overall though, definitely one of the best 100-dollar purchases I've ever made.
 


I also picked one up at MC for $99. Mine is just bumped to stock FX 8350 speeds. I have no need for more on that system.
 
Tom's Hardware, will you please add the Core i5-4440 to the list? It's just not there, but I saw this particular model today at Microcenter. So why isn't it listed here?
 


Not really as close to the i5s in performance as many think. The i5s are much better in gaming and most tasks except those that are highly threaded. Plus, with the i5 you can upgrade to an i7; with the FX-8350 you're already maxed out unless your motherboard is one that will handle the FX-9000 models.
 
Not sure if anyone looked at the CPU gaming benchmarks, but they were all run at low settings and low res... not real-world performance. This is some serious Intel bias because it only looks at the CPU and not the GPU. Not a very good comparison... and then two of the three games you run it on are CPU-intensive (SC2 and Skyrim). SERIOUS Intel bias.
 


Who the hell buys these parts and games at low res (we're talking 800x600 for the tests) and low settings? NO ONE! We need real-world performance at real-world settings using the same GPU across the board for a REAL comparison. This article is null and void. Bad experimentation.
 

A *CPU* benchmark would be worthless as a *CPU* benchmark if the results end up practically identical across the board due to being almost entirely GPU-bound.

A component-centric benchmark (such as a *CPU* benchmark which is exclusively focusing on *CPU* performance) has to minimize the influence of other factors as much as possible, and the easiest way to reduce the GPU's influence as an unknown and potential bottleneck is to lower resolution and disable GPU-intensive options. This is a basic principle of scientific testing: eliminate as many potential variables and unknowns as possible.

If you want benchmarks under more typical gaming conditions, look at GPU reviews. If you really want a sense of how much influence the CPU has on GPU performance under more typical resolutions and details, you can be more specific and hunt down CPU-GPU scaling benchmarks.

The benchmark is not invalid: it does represent the theoretical frame rate the CPU might be able to achieve if it had infinite GPU processing power with vsync off. You are simply looking at the wrong benchmark for the wrong reasons.
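To put some rough numbers behind this, here's a toy model (all of the timings below are invented for illustration, nothing from the actual article) of why a GPU-bound test hides CPU differences while a low-res test exposes them:

```python
# Toy model: each frame takes as long as the slower of the CPU's per-frame
# work and the GPU's per-frame work. All millisecond figures are made up.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the frame is gated by the slower component."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"fast CPU": 4.0, "slow CPU": 8.0}   # fictional ms of CPU work per frame

# At high res/ultra the GPU needs ~20 ms per frame: both CPUs score the same.
for name, cpu_ms in cpus.items():
    print(f"{name} @ ultra:   {fps(cpu_ms, 20.0):.0f} fps")

# At 800x600/low the GPU needs ~2 ms: the CPU difference finally shows up.
for name, cpu_ms in cpus.items():
    print(f"{name} @ low res: {fps(cpu_ms, 2.0):.0f} fps")
```

Both CPUs land at 50 fps in the GPU-bound case, while the low-res run shows 250 vs. 125 fps, which is exactly the separation a CPU chart is trying to measure.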
 

Microcenter prices (along with limited-duration sales, promotions, rebates, etc.) are not counted in the recommendations because MC is only accessible in select areas, not everywhere. As for its standing, just refer to the gaming CPU hierarchy chart. As for promos, IIRC there's a different article/column where various hardware deals are posted.
 


I will give you the GPU scaling, although if you look at game settings compared across the board with only the CPU differing, you still get different levels of performance outside of standard error (~3-5%). Also, some processors have been shown to handle higher resolutions better than others. Basically, the question is: is my CPU having any effect on my performance with this setup at the resolution MOST people play at (usually 1080p, according to the Steam numbers)? The FX-8350 had very little drop-off going from 1080p to 1440p compared to the 3770K in a review by Tek Syndicate with a dual-GPU setup, which might account for some CPU bottlenecking. The theoretical performance with an unlimited GPU is still moot because we can never have unlimited GPU performance. Games will definitely favor either the CPU or the GPU more heavily, and they only used three games. Two of those games are very CPU-bound (SC2 and Skyrim), which will automatically favor Intel. Popular games, of course, but the test still can't be called useful, as it has no basis in the real world.

Many videos from Linus and Logan show the 3570K and 3770K sometimes beating and sometimes being beaten by the FX-8350 in gaming. There is a difference when you keep the settings the same and change ONLY the CPU. The difference becomes lopsided in AMD's favor when you add streaming... and it costs less than both of those. I got mine for $150 in a Black Friday sale (what a friggin' deal, right?). Synthetics and low-res benchmarks have little to no basis in the real world and mislead consumers into buying products they don't actually need. Buy a 4960X, get 10 more FPS in Crysis 3 than the FX-8350... and justify the 5x higher cost.
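To put that value argument in numbers (the prices and frame rates below are rough placeholders along the lines quoted above, not benchmark results):

```python
# Rough cost-per-frame comparison. Both the prices and the fps numbers here
# are illustrative assumptions, not measured data.

chips = {
    "FX-8350":  {"price": 200,  "fps": 50},   # assumed price and assumed fps
    "i7-4960X": {"price": 1000, "fps": 60},   # ~5x the price for ~10 more fps
}

for name, c in chips.items():
    print(f"{name}: ${c['price'] / c['fps']:.2f} per fps")
```

On those assumed numbers the 4960X costs roughly four times as much per frame per second, which is the kind of diminishing return being complained about here.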

What good is the experimentation if it is not applicable to the real world? Not much. I work in the scientific research community and I see a lot wrong with the design and interpretation of these results.
 


Kaveri isn't a gaming chip. For the price they are going for, they shouldn't even be considered for one. A 750k and 7750 would give more overall gaming performance than a Kaveri.
 
The comments aren't "broken," the systems underlying them have certain limitations. I believe those limitations are being investigated with an eye toward finding ways to clear out old comments. From what I understand, it is not as simple as it sounds, as there are issues of referential integrity that prevent a simple reset.
I also agree that Kaveri does not look like a good gaming CPU, but I do believe it should be on the chart, if only for reference. Someone may have one and wonder what to expect from it.
 


I think you are still not seeing the point of the three charts here. This article is NOT a benchmark article. As such, TH is not going to include 10 pages of gaming benchmarks and explanations. This article is intended (as they so clearly point out on the first page) as a quick hit option for GAMERS in the market for a new CPU to get some idea of what the best CPU(s) in their price range are. The minimal charts are included simply as a reference point to give some idea of how each step up or down will compare with the previous. They help give a better idea of what your "bang-for-buck" ratio is at each level.



Now, as far as using lower resolutions to test processors in gaming, it is not flawed methodology. Just as any chain is only as strong as its weakest link, so is every gaming computer only as fast as its slowest component. There is almost no cause in a dedicated gaming computer (no other uses) to pump $400+ into a CPU and only $100 into a GPU. Your GPU is going to be your obvious bottleneck. Likewise, if you want to maximize performance, you don't want to spend $450 on a GPU and $50 on a CPU. The low performance of that CPU will bottleneck the computer and cut into your overall gaming performance. All of this I'm assuming you probably already know. Cutting the graphics settings down on any game eliminates the GPU from the equation as a potential bottleneck and is absolutely necessary when trying to answer the question of when each CPU will become the limiting factor in a gaming computer.

Stick the most powerful GPU setup possible in a computer and start turning up the resolution and graphical eye candy and you will undoubtedly start to hit some graphical ceilings. In doing this you are also putting stress on the motherboard, RAM, and other various components. So you may still see performance differences between CPUs, but those differences are also likely being affected by the other components. No, in order to determine the capabilities of each CPU you must design a scenario, even if it isn't a "real world" scenario, in which you absolutely know that the CPU will become the limiting factor.
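A quick sketch of that weakest-link idea, using two completely made-up linear cost-to-performance curves (not real pricing or benchmark data), just to show why lopsided budget splits waste money:

```python
# Toy "weakest link" model: for a fixed budget, the rig's frame rate is capped
# by whichever of the CPU or GPU got starved of money. Both curves are fictional.

def cpu_fps_cap(dollars):
    return 40 + 0.25 * dollars   # fictional: fps the CPU can feed to the GPU

def gpu_fps_cap(dollars):
    return 10 + 0.30 * dollars   # fictional: fps the GPU can render

budget = 500
for cpu_budget in (50, 150, 250, 400, 450):
    gpu_budget = budget - cpu_budget
    frame_rate = min(cpu_fps_cap(cpu_budget), gpu_fps_cap(gpu_budget))
    print(f"${cpu_budget} CPU + ${gpu_budget} GPU -> ~{frame_rate:.0f} fps")
```

With these invented curves the balanced splits come out well ahead of the $50-CPU or $450-CPU extremes, and the low-res CPU tests are what tell you where that crossover actually sits for a given chip.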
 

It's not as bad as people think it is. Kaveri, i.e. the Steamroller cores, has higher multicore performance than Piledriver:
http://media.bestofmicro.com/0/T/418637/original/per-clock.png
Single-core perf isn't as high, so multicore-friendly games might see positive gains in CPU-bound situations with a discrete gaming graphics card. Even though stock settings might not show such gains, overclocking will... as long as Kaveri can OC high enough, like 4.3-4.6 GHz stable, with a reasonably priced HSF. Tom's used ITX platforms for both the Intel and AMD test rigs in the launch review, BTW.
I'd like to see how dual cores perform on Kaveri/SR, in configs such as 1M/2C and 2M/2C, to observe how older and multicore-unfriendly games fare.

The worst thing about the current A10 Kaveris is the price AMD set. The base price needs to drop to the $140-150 level so that retail prices can drop; at that point the APUs are reasonable buys IMO. As for iGPU-off CPUs, Kaveri-based Athlons should sell for less than $110-130 - that's FX-6300 price territory. If AMD !@#$s around and launches the Athlons too late in the cycle, like it did with the Trinity/Richland Athlons, they won't be worth buying.
 


Excuse me for the typo, I meant to say 4790, the Haswell refresh chip...
 

That's a per-clock comparison. The problem is that Steamroller runs at lower clocks than Piledriver, so overall performance doesn't improve much.
 


And of course i3 is a gaming chip. Or A6-5500K. OR CELERONS. Yeah...

And last I checked, Kaveri's marketing includes games.



Thank you for the answer, but I have to say that when all I see is year-old comments, something is wrong. And that's what I call broken. It wouldn't be that broken if there were actual pages in place, not that silly but oh-so-modern "show more" abomination. Why are the comments for all these monthly articles mashed into one mess anyway? I don't see any sense in it. I'm sure it's been mentioned in this massive thread somewhere, but searching for it sounds pretty unpleasant.
And yes, I do wonder what to expect from the CPU side of it alone, though I imagine the hopefully-near-future features like HSA or Mantle will make it difficult to position it somewhere in the chart, unless those features are completely disregarded for that purpose, which is something I would understand.
 

There are no "all these monthly articles", there's just one article that gets updated month after month.
 


The i3 is included in the best gaming CPU for the money list every month. Kaveri is too overpriced to even consider. I wouldn't consider any APU for a gaming rig, period. A Celeron wouldn't be a consideration either. I would spend the extra $20 on the Pentium G for more cache before I bought a Celeron. I could probably put together an old E8400 rig for less and get the same performance I would from a Celeron. :lol: If I wanted performance that pitiful, I would just get my overclocked E8190 back from my mother's rig and give her the E2160 I have lying around. You can get better performance from a Pentium G and a dedicated GPU, or a 750K and a dedicated GPU, for a similar cost or less than the cost of a 7850K.
 
Ehm... where are all the AMD CPUs? All I can see are the really low-cost CPUs from AMD compared to high-end Intel... c'mon guys -.-
 
Because AMD cannot compete on the high end. FX 8320 is a nice chip for the $$$ vs a locked i5 if you are willing to overclock. I have an FX 8320 in one system already. It has been fine so far, even though it is really just a WoW rig at this point. Gotta love those Microcenter specials. 😀 If I had to start from scratch, FX 8320 and a 990fxa-ud3 would probably be my picks.
 