Best Gaming CPUs For The Money: January 2012 (Archive)

Page 27
Status
Not open for further replies.

InvalidError

Titan
Moderator

Except stock AMD CPUs are nowhere near stock Intel performance in moderately threaded applications, and once you overclock AMD CPUs to match, AMD's power efficiency drops off a cliff.

As for idle, I have repeated multiple times that the $100+/year saving is mostly for people who DON'T IDLE much. In my case, once all my junk is loaded, my CPU rarely gets a chance to drop below 50% load, so its cores spend most of their limited idle time in the C1 state with the CPU clock pegged at 3.4 GHz.
 

vertexx

Honorable
Apr 2, 2013


A stock FX-6300 beats the i3 in gaming, which is what this article is about. Problem is, this article needs to update its gaming selection to newer titles. If you look here (http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584.html) and here (http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487.html), the stock FX-6350 in that article beat the i3 in most titles.

$100/year would be for people who run at full throttle 24x7, 365 days a year, or who live on an island in the Caribbean or South Pacific where electricity is expensive. Probably the exception rather than the rule. But hey, for those folks, sure, the i3 is better.
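For context, here is a back-of-the-envelope sketch of that annual power cost. The 50 W CPU power delta and the electricity rates are illustrative assumptions, not figures from this thread:

```python
# Annual electricity cost of a CPU power delta (illustrative numbers only).
def annual_cost(delta_watts, hours_per_day, rate_per_kwh):
    """Cost per year of drawing `delta_watts` extra for `hours_per_day`."""
    return delta_watts / 1000 * hours_per_day * 365 * rate_per_kwh

print(round(annual_cost(50, 24, 0.12), 2))  # full throttle 24/7 -> 52.56
print(round(annual_cost(50, 4, 0.12), 2))   # 4 h/day of load    -> 8.76
```

Even at full throttle 24/7, a 50 W delta at $0.12/kWh comes to about $53/year; reaching $100/year would take roughly $0.23/kWh, which is island-grid territory, consistent with the point above.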
 

InvalidError

Titan
Moderator

My point was that, for people who make their computers do something most of the time, the power savings of going Intel instead of AMD make it possible to afford an i5-3470 or better once power costs over time are accounted for - or conversely, whatever is saved up front by going AMD will be paid to the power company over time.
 

1991ATServerTower

Distinguished
May 6, 2013
Seriously, Tom's... is the FX-4130 really a Vishera part? I can't find any official specs from AMD on the matter, and given its 125 W TDP compared to the FX-4300, it stands to reason that the FX-4130 is a Zambezi processor...

Wish AMD had an "Ark" website like Intel...
 
oO
Am I the only one who sees this list as outdated?

First, the FX-4300 is $5 more than the FX-4130, which obviously makes it the better choice.
From 1st page: "The FX-4300 is down $10 to $110, making it an interesting alternative to the $105 FX-4130."

Second, few people would buy an outdated platform. I understand an honorable mention for the 3570K because it overclocks better than the 4670K, but it also needs around 300 MHz more in order to compete with the 4670K's newer architecture (example: in TPU's comparison, a 3570K at 4.5 GHz is equal to a 4670K at 4.2 GHz in the x264 HD Benchmark). But anyway, it's a "K" version, so mentioning it for overclocking reasons seems legit. What I don't understand is the Core i5-3350P, when you can get the i5-4430 for $10 more (and it has the benefits of the Haswell architecture).
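That clock-for-clock comparison can be sanity-checked with a quick effective-performance estimate (performance ≈ clock × relative IPC). The ~7% Haswell IPC advantage here is an assumption inferred from the TPU result above, not a measured figure:

```python
# Effective performance ~ clock (GHz) x relative IPC (illustrative).
ivy_perf = 4.5 * 1.00        # 3570K at 4.5 GHz, baseline IPC
haswell_perf = 4.2 * 1.07    # 4670K at 4.2 GHz, assumed ~7% IPC gain
print(round(haswell_perf / ivy_perf, 3))  # -> 0.999, i.e. roughly equal
```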

Third, the 3930K loses the majority of gaming benchmarks to the 4770K. Apart from the fact that it doesn't deserve an honorable mention, it costs $230 more. oO At least you could give an honorable mention to the 4930K, which is a newer architecture and offers more performance at the same price ($569 vs $579).
Again this is coming from the first page:
"Consequently, we're giving the Core i7-4930K an honorable mention in the top spot, replacing the Core i7-3930K."
So what's going on here? oO
 

rd1

Distinguished
Dec 10, 2008
Bit of a typo on the 1st page, mid-section:

"we have a $70 Pentium G3220 (3 GHz), an $85 Pentium G3420 (3.2 GHz), and the $100 Pentium G3430 (3.3 GHz). They're all 54 W parts with HD Graphics on-die, which gamers won't care about."

then next part says:
" The only real news bit is that the 3.3 GHz Pentium >>>G3220<<< is Intel's fastest Pentium ever,"
 

vertexx

Honorable
Apr 2, 2013

Answer is here: http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487-15.html#comments

Skyrim is notoriously poorly threaded, so the budget Pentiums do well. Of course, the test setup runs a 7970.

Right now I have two budget setups in my house that we use to play a heavily modded Skyrim. One is an i3/GTX 660 all-stock combo in an ITX HTPC/gaming setup. The other is an AMD Phenom II X4 965 @ 4.0 GHz with a 7850 OC'd to 1050 MHz. I paid $75 for the Phenom, $190 for the 7850, $130 for the i3, and $220 for the 660. I also paid $99 for the ITX H77 board and only $55 for the cheapest 125 W uATX AMD board I could find.

On both setups, I run the same 120+ mod pack with high-res textures, realism mods, ENB, etc. Game settings are 1080p/Ultra with a bunch of INI tweaks. They both deliver essentially the same gaming experience: with ENB on, I get frame rates in the 40s on both; with ENB off, frame rates in the 60s. I'm happy with both. The AMD setup is fun to tweak for optimum performance; the i3/660 setup sits in the media center and just does its thing.
 

No, and the same mistake was pointed out last month. This is pretty embarrassing, really: singling it out for praise based on the advantages of the Piledriver architecture when in fact it's crummy old Bulldozer.
 

nokiddingboss

Honorable
Feb 5, 2013


That's great for your uncle Andrew, but are you sure his power consumption is low, though? :D
I also heard Laura makes $1,000 a month doing blows. Or was it actually you? Spambots - gotta love his uncle Drew.
 
So I'm a bit confused about the hierarchy chart. Are we saying that the FX-8350 is at the same performance level as the FX-4350? And yes, I understand we are talking about relative gaming performance only.

Here's my two cents: why don't we drop the last few tiers of five-year-old CPUs (and other irrelevant processors) and focus on making the upper tiers a bit more granular? If the lowest recommendation in the article for gaming is the Athlon X4 750K, then do we really care what's happening nine tiers below that?

 
I'm inclined to agree with buzznut. There is too much compression at the top of the charts. With some games now better threaded, the "one-size-fits-all" ranking may not work as well as it used to. I'd suggest color-coding: put a CPU in red if it loses a tier in particularly well-threaded games, and in green if it gains a tier.
 


Part of the problem is that the difference in performance from a Phenom II X4 965 BE to an i5-3570K isn't all that much. When you take into consideration that 90%+ of gamers are on 60 Hz monitors, the difference in performance between the two vanishes to almost nothing, as the extra 5-40 fps disappears into irrelevancy behind the low refresh rate of their monitors. I've said this before on this topic, but it's really hard to classify the different experiences between high-end chips when the end user can't tell which CPU he's using.

When THG put the list together, they made a curious claim: that CPUs within three tiers of each other would not feel all that different to the end user. Frankly, THG specifically said most users won't feel that the price of updating their CPU is worth it if the new CPU is within three tiers of the old one. So how are they supposed to uncompress the charts when most of the high-end CPUs put out by both companies within the last three years are basically indistinguishable from each other as far as the end user is concerned?
 

InvalidError

Titan
Moderator

As far as benchmarks are concerned, there really hasn't been much improvement in mainstream CPUs, at least on Intel's side: only a ~20% bump from Sandy Bridge to Haswell. For most people, this is genuinely indistinguishable in day-to-day computing, and you also see tons of Sandy Bridge enthusiasts deciding to skip Ivy Bridge and Haswell due to the insignificant benefits.

Pretty hard to spread out the distribution when performance near the top of the mainstream segment has been effectively stagnant for so long.
 
Although that's a valid point, in GW2 I can tell the difference between a 3.8 GHz Phenom II 970 BE and a stock i5-3570K; the latter is considerably smoother, and there is only a one-tier difference (because the 970 is OC'd past the 980's stock speed). Maximum FPS seems dependent on the graphics card, but the minimum seems more reliant on the CPU.
 


Guild Wars 2 is notoriously single-threaded, though. Sure, that game will feel different, particularly in the massive world battles, and there are examples of games out there, here and there, that will feel different. That said, had you never played on the i5, I doubt you would have felt the loss. The game would run, be playable, and look good to your eyes, I'm sure. There will be games and resolutions that push CPUs to their limits, and in those situations the better CPU might feel a little better to game on.
 

mohit9206

Distinguished
I do not agree with Tom's budget CPU recommendation under $100. Pentium dual-cores should still be recommended over the Athlons because, aside from a handful of games that take advantage of more than two cores, most games remain either single- or dual-threaded. Until at least 50% of all games take advantage of more than two cores, Pentiums are still better for gaming than cheap Athlons.
 

I do like this idea. But Ingtar has a point: any good CPU from the past three years will feel almost indistinguishable in most games. Still, I would like to see a little separation at the top, provided it's clear that anything in the top X tiers is considered more than good enough for any AAA gaming. Maybe something like: the top three tiers are mainly for bragging rights or specialty computing, but anything in the top five tiers will serve up 40+ fps in any game (provided it's paired with an appropriate GPU). Tier six is adequate for casual/budget gaming; tier seven and below are hitting EoL.



Though I'm not as negative on dual-cores as CMI, I wouldn't recommend one to a gamer as a new purchase. If you're purchasing a new CPU right now, chances are you want it to last at least two years, maybe three or four. A dual-core will handle most games right now just fine, but I doubt it will last the full three years before it starts to get bogged down.
 

mohit9206

Distinguished


I know it's not 2010, but even today only a few games take advantage of more than two cores. When the majority of games take advantage of four cores or more, then a quad should be recommended among budget gaming CPUs. Athlons at stock are inferior to Pentiums, and overclocked they consume twice as much power, so the heat and power consumption tradeoff is not worth the little extra performance. This applies only to CPUs under $80 or so.
 

thehat2k5

Distinguished
Dec 27, 2007
How much did Intel pay you for this article? AS IF... AMD has a few chips that belong in the top slot. I've seen the benchmarks on this very site. WTF?
 

InvalidError

Titan
Moderator

The benefits of having four cores (or two cores + HT) start showing up well before applications/games use all four threads/cores. The extra simultaneous execution contexts let computers with more cores/threads start running threads in parallel earlier than they would if those threads had to be queued on fewer execution resources, which can make an i3 feel more responsive than a Pentium even when the Pentium is only at 70% load. More threads/cores also mean fewer context switches, which makes things more efficient, such as allowing background processes to execute without kicking foreground processes off their CPU.

The advantages of increased scheduling concurrency at the application and OS layers may be subtle, but they are there. If you restrict a game to one core (HT or not), you will likely notice tons of background/OS activity still landing on the other cores. I just tried that on my PC by setting affinity on my active programs to core #3 (about 97% user-time and 3% kernel-time), and I end up with 10% kernel-time activity on cores #0-2, which have less than 1% user-time load. So that's ~30% of a single core's worth of kernel-time that gets offloaded from overloaded core #3, because the OS schedules everything it can onto other cores to free up as much time as possible for user-land processes on #3. This effectively makes my simulated single-threaded programs ~30% faster than they would be without the extra cores.
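For anyone who wants to repeat that affinity experiment, here is a minimal sketch using Python's `os.sched_setaffinity` (Linux-only); picking core #3 simply mirrors the test described above, and the code falls back to the lowest available core if #3 doesn't exist:

```python
import os

# Pin the current process to a single logical CPU so all of its user-land
# work lands on one core, leaving the OS free to push kernel/background
# activity onto the others.
available = os.sched_getaffinity(0)            # CPUs this process may use
target = {3} if 3 in available else {min(available)}
os.sched_setaffinity(0, target)                # restrict to one core
assert os.sched_getaffinity(0) == target       # scheduler honours the mask
```

Per-core user vs. kernel time can then be watched while the pinned workload runs, e.g. with `top` (press 1 for per-CPU rows) or `mpstat -P ALL`.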
 

vertexx

Honorable
Apr 2, 2013


Oh boy, another one... Take a look through the above - lots of links showing the advantages of more cores in today's gaming. If you want to play older games, the AMD CPUs may not be as fast as the Intel ones in single-threaded performance, but they will be more than good enough - I mean, once you hit >60 fps in a budget setup, how much faster do you need to go? Take a look at this: http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584-19.html - it's a much more comprehensive review of the CPUs. There is also a sister article for the Intel CPUs if you want to compare side by side for each game (http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487.html).

There has also been plenty of discussion above on CPU power consumption differences being moot for gaming - not even going to go there again.

Bottom line: the budget quad- and six-core AMD CPUs are now better for gaming than the budget Intel CPUs. Single-threaded performance is more than playable, and where you really need it with newer titles, the AMD CPUs deliver better performance in a budget setup.
 
Pretty sad seeing wacko AMD fanboys trying to pretend AMD is so much faster than any Intel... LOL!
For a LOW-END CPU, there is a handful of AMD chips that are the best bang for the buck, but as the list clearly shows, it's Intel CPUs that dominate. It's just how it is and always will be. AMD already admitted THEY WILL NOT COMPETE against Intel.
The only thing that can help AMD now is if game makers bother writing for 64-bit multi-core CPUs. But since they typically write for consoles, then port to PC, that's not exactly high on their to-do list.
 