Best Gaming CPUs For The Money: January 2012 (Archive)

Status
Not open for further replies.
The hierarchy chart refers to the CPU performance of the processors, not to the combined CPU+iGPU performance. So even if the A6 is listed there, it will be ranked according to its capabilities as a gaming CPU when combined with a discrete GPU. Still, I wouldn't recommend any 2-core/2-thread chip for even an entry-level gaming rig, especially AMD chips. You can get away with a dual-core Pentium if you mainly play older games and MMOs that still use ancient engines.
 

Because they keep updating the old article instead of posting a new one with its own fresh thread on the forum. That's why the forum thread has 28 pages of posts and still growing.
 
Best gaming CPU for the money is a used Xeon X3460 (a rebadged i7-860) that can be bought for as low as 79 dollars. Pair it with a refurbed Asus P7P55DE motherboard and for 160 dollars you have a killer gaming combo.
 


You are also limited to two USB 3.0 ports and only PCIe 2.0... I'll stick with the Haswell system I just built.


I don't know that I will ever understand recommending a new build with outdated technology. Ivy Bridge I can still understand using in a build; heck, I may disagree with it, but I can also understand using Sandy Bridge because you want to be able to say you have a higher clock rate. But why would you want to install a 5-year-old processor (the X3460 was released in 2009) in a new system?
 
How many USB 3.0 devices are out there? I don't really know of many. PCI-E 2.0 is still plenty for 99.99999% of users. For those on a strict budget, that Xeon combo isn't horrible. It would be better than anything you could get new for that price. You would be looking at an FX-6300 and a 760G motherboard, or a Haswell i3 with a B85 board, for that price range.
 

External HDDs and USB3 thumb-drives would be by far the most common USB3 devices out there but even then, most people rarely need to have more than one or two of those plugged in at a time.
 


That is kinda what I was thinking too. People make a big deal out of features that you hardly use. The PCI-E 3.0 vs 2.0 is the most common one though. Even a Titan Black wouldn't fully saturate a 2.0 x16 slot.
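The bandwidth numbers back this up. A rough back-of-the-envelope calculation (link rates and encoding overheads are from the PCIe 2.0/3.0 specs; real-world throughput is lower once protocol overhead is included):

```python
# Approximate PCIe bandwidth per direction, ignoring packet/protocol overhead.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (20% encoding overhead).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~1.5% overhead).

def pcie_bandwidth_gbs(gen, lanes):
    """Approximate usable bandwidth in GB/s for one direction."""
    per_lane_gbits = {
        2: 5.0 * (8 / 10),     # -> 4.0 Gb/s per lane
        3: 8.0 * (128 / 130),  # -> ~7.88 Gb/s per lane
    }
    return per_lane_gbits[gen] * lanes / 8  # bits -> bytes

print(pcie_bandwidth_gbs(2, 16))  # 8.0 GB/s
print(pcie_bandwidth_gbs(3, 16))  # ~15.75 GB/s
```

Roughly a doubling on paper, but since even top-end cards of the time don't push 8 GB/s of sustained traffic over the slot, the 2.0 vs 3.0 difference rarely shows up in game benchmarks.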
 


I'll give you that PCIe 2.0 doesn't actually limit gaming performance, but the principle is still there.

Look, if someone told you they had a system they wanted to sell with a Xeon X3460, Asus P7P55DE, 4 GB of DDR3 1333 RAM, and an nVidia 9800 GT then how much would you tell them they could get for it? You are looking at parts with 5+ years of wear and tear on them.

Personally I would rather not build with 5 year old used parts on a dated platform. I wouldn't be able to trust it to last me more than a year or two, if that.

That said, I prefer to build my systems with the idea that they will last me well over 5 years before any major upgrades are necessary (a new GPU would probably be the most expensive upgrade I would consider). I suppose there are people who prefer a different upgrade cycle, and, no, that processor isn't likely to bottleneck your gaming FPS. It's simply not a move I would personally ever advise someone to make.
 


I would probably take what is essentially an i7 860 over an i3 4130 or an FX 6300, if that is all I could afford. I tend to do some kind of upgrade every couple years or so. My last upgrade was GPU, since my 3570k is still more than enough for my needs. Still kicking around the idea of a 1230v3 mini-itx rig though.
 

I'm of the same philosophy. Yes, some old-ish CPUs still have power for today (and likely for the next year, maybe two), but I build machines hoping to get five years of use out of them too. Building a "new" system every 2-3 years from used parts doesn't seem like it will save you much money, if any, in the long run vs building a new budget machine every 4-5 years.

But even if the used parts had enough compute power to last that long, will they physically last that long? You run the risk of getting parts and components that have been used and abused. GPUs that have been pressed too hard for mining, CPUs that have been unsafely OC'd. If I'm not getting it from a reliable source, I'd rather not play that lottery.
 
I did not buy this for a new rig, I bought it for a cool test gaming rig to see what I can use of older tech that's cheap now and make a good gaming system. My main rig is still my i5 2500k paired with a GTX 660 that I built over a year and a half ago; I just like to tinker with older tech that has potential to stay relevant. BTW, with that combo I was able to get that X3460 OC'd to 4 GHz and it flew, almost keeping up with my 2500k, which I also have at 4 GHz. I just enjoy getting the most out of old tech. I've yet to buy a used CPU that doesn't work for a while.
 
A really good advantage to a 39xxK or X or 49xxK or X is you can stream your game content and other video/audio seamlessly from the same PC (highly CPU dependent) at high quality / bitrate and native res. Not sure if this works as well with fewer cores on multi-threaded games such as multi-player BF4, but would be curious to know?
 
I really don't understand the final analysis of the most inexpensive CPUs. It claims a significant performance boost from the 750K to the FX-6300 (5.75% avg increase based on the graph, for a 50% price increase, or $40) but less of a boost from the 6300 to the 4130 (7.25% avg increase for a 4% price increase, or $5). So a 0.14% avg gain per dollar vs a 1.45% avg gain per dollar. Am I missing something?
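The arithmetic in that post checks out. A quick sketch using only the figures quoted in the post itself (average percentage gains read off the article's graph, street prices at the time):

```python
# Performance-per-dollar check, using the numbers from the post:
# Athlon X4 750K ($80) -> FX-6300 ($120): +5.75% average across the tested games
# FX-6300 ($120) -> Core i3-4130 ($125): +7.25% average

def gain_per_dollar(avg_gain_pct, price_increase):
    """Average performance gain (in %) per extra dollar spent."""
    return avg_gain_pct / price_increase

step_750k_to_6300 = gain_per_dollar(5.75, 120 - 80)   # ~0.14 %/dollar
step_6300_to_4130 = gain_per_dollar(7.25, 125 - 120)  # 1.45 %/dollar
print(step_750k_to_6300, step_6300_to_4130)
```

By this measure the FX-6300 to i3-4130 step is roughly ten times the value of the 750K to FX-6300 step, which is the opposite of what the article's conclusion implies.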
 
If you have a Microcenter near you, the i5-4670K ($200 and up) falls into the $125-$200 segment, since they discount their CPUs so much. Plus they usually have mobo combos that are decent.
 
The Athlon X4 750K just came off E-Blast at $70. I couldn't resist.

The big price premium on the A10-5700 is due to its 384 'Turks' (?) Radeon cores and 65W label. It pummels 'older' titles even without dual-graphics.

[Attached image: A10-5700_load_01.png]


At full load the CPU rumbles along at less than 30w occasionally peaking in the low 40s. The A10-5700[strike]K [/strike] is one of the few AMD chips where the price went up over time :lol:

Edit: Not unlocked.

 

If you consider it so hopeless, it's a wonder you waste any of your time on their site.

Your 8350 murders other CPUs in what way? In synthetic benchmarks? In multi-threaded productivity and creation apps? Notice this list is called Best GAMING CPUs for the Money? The 8350 only wins when the majority of its integer cores are engaged, and relatively few games go over two threads right now (let alone the four a 4670 can handle). Clock for clock, Intel has AMD beat pretty badly right now, so no game is going to run notably better on an 8350 compared to the 4670, both of which are ~$200.
 
From page 6:
"Budget-oriented gamers should pay attention to the significant performance increase available when you step up from the $80 Athlon X4 750K to the $120 FX-6300. You get less of a boost moving from that FX to Intel's $125 Core i3-4130, though."

Unless I'm missing something, this just isn't true. Going from the 750k to the FX-6300 only grants significant gains for Far Cry. The average increase across all 4 games is 5.75, with a cost increase of $40. Going from the FX-6300 to the i3-4130 results in an average increase of 7.25 for a cost increase of only $5. Going from a FX-6300 to an i3-4130 offers greater gains than going from a 750K to an FX-6300, both in terms of absolute performance increase as well as performance increase relative to price increase.

Edit: D'oh, jmrainwa already said this, just a handful of posts before mine.
 