Best Gaming CPUs For The Money: January 2012 (Archive)



We've said Hyper-Threading was usually useless in gaming when explaining why an i5 is generally a smarter purchase than an i7, since both already have four physical cores. Being able to run 8 threads usually isn't consequential in gaming. But an i3 only has 2 physical cores, so Hyper-Threading letting it run 4 threads is pretty important these days. It's not quite the benefit of a true quad-core like an i5, of course.
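If you want to see the physical-core vs. hardware-thread split on your own machine, here's a quick Python sketch (assuming the third-party psutil package is installed; an i3 like the one discussed above would show 2 cores and 4 threads):

```python
# Report physical cores vs. hardware threads.
# Requires the third-party psutil package (pip install psutil);
# os.cpu_count() by itself only reports logical CPUs.
import os
import psutil

physical = psutil.cpu_count(logical=False)  # real cores (2 on an i3)
logical = psutil.cpu_count(logical=True)    # hardware threads (4 with Hyper-Threading)

print(f"Physical cores:   {physical}")
print(f"Hardware threads: {logical}")
print(f"os.cpu_count():   {os.cpu_count()} (logical CPUs)")
```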
 
Yet the quad-core Phenom/Athlon/APU CPUs trounce the Pentiums and Celerons in media encoding, and look how many millions of people are making gaming videos on YouTube with their low-end machines. The quads are the smart, overall, long-term choice for new computer buyers on a serious budget. They'll get more for their dollar.
 


Very embarrassing indeed. This is just a copy/paste of last month. It would be nice if the person responsible for this list would actually read the comments and at the very least fix errors such as this.
 
The 750K is Trinity... the 760K is Richland. 400 MHz higher stock base frequency, improved Turbo Core, and I'd bet a higher ceiling on affordable air cooling (a sub-$30 HSF). That's worth $5 in my book.

If you're overclocking with very high-end air cooling or watercooling, then you're probably not using these chips to start with. But for a budget gaming build with some light overclocking? I guarantee you the 760K is the better choice for the extra $5.
 
You can get a 4770K for $279.99 at Micro Center stores. That's a much better option for the money than both of the 3rd-gen Intel processors at the prices they listed.
 

But for a large chunk of people out there who have no Micro Center anywhere near them, driving there (since the discount is in-store only) would cost over $50 in gas.
 


$50 in gas? I think not. But you're right, Micro Centers are not everywhere. You could have a family member or friend buy it and ship it, and it would still be better bang for your buck than the other Intel i7 mentioned. I mentioned it mostly for the people who do have a Micro Center near them, which is probably many; they could save some money and get a better processor than the Ivy Bridge i7 on the list.
 


Yes, $50 in gas. Probably more for me, actually, because the closest Micro Center to my location is roughly a 5-hour drive. 10 hours to get there and back? Easily a $50 fill-up.
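Back-of-the-envelope, with hypothetical driving numbers (speed, mileage, and gas price are all assumptions, not measurements):

```python
# Hypothetical round-trip fuel cost for the Micro Center run described above.
# Every input here is an assumption for illustration.
round_trip_hours = 10    # 5 hours each way, per the post
avg_speed_mph = 60       # assumed highway average
mpg = 25                 # assumed fuel economy
price_per_gallon = 3.50  # assumed gas price (circa 2013)

miles = round_trip_hours * avg_speed_mph   # 600 miles
gallons = miles / mpg                      # 24 gallons
cost = gallons * price_per_gallon          # $84.00

print(f"{miles:.0f} miles, {gallons:.1f} gallons, ${cost:.2f} in gas")
```

So even with generous assumptions, the trip clears $50 in gas alone.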
 

Trinity and Richland are practically identical though. And stock clocks are kinda meaningless for overclockable processors. Unless they're binned differently, any clocks you get with one CPU are achievable with the other as well.
 
I don't see how it's a consequence that the Core i5-3350P ducks in under 70 watts! Also, the TDP of the Core i5-3350P is 69 watts, not 77 watts!
 
Why do AMD chips have such high TDPs compared to Intel's offerings? What are they not doing right? I mean, how can an Intel chip deliver higher performance at less than half the wattage?
 
By any chance, would you be able to add Intel Xeons (which are compatible with Socket 1155 and other desktop motherboards)? They pack a punch: 4 cores with HT for the price of a regular Core i5.
 


There is something really wrong with this post, and I'm not sure if it's because you're trolling or simply don't understand what TDP is. TDP = Thermal Design Power. The TDP of Intel CPUs != the TDP of AMD CPUs; the two companies measure it differently. AMD, for example, likes nice round numbers, so they basically round to the nearest best guess. As a result, all AMD desktop chips (regardless of actual TDP as Intel would measure it, or even actual wattage) pretty much get rated as either a 65 W, a 95 W, or a 125 W CPU.

Second, TDP != the actual power draw of the part. TDP is a function of the average heat energy that needs to be dissipated from the CPU in order to maintain standard operation. It's basically a rating for CPU COOLERS; it's not an accurate representation of the actual wattage used by the CPU. AMD likes to round up and tends to have higher TDPs because their CPUs need to run COOLER than Intel CPUs. In short, they need to get rid of heat more efficiently in order to maintain standard operation. (They also measure temperature in a different place than Intel does. That said, even if they measured it in the same place, their chips would still need to run cooler than an Intel, so it's a moot point.)

Now, that said, your basic point, despite the poor word choice, has a nugget of truth. AMD CPUs generally don't have low-power idle states that work as efficiently as Intel's, so when the CPU is doing next to nothing, an Intel will scale down to a lower power draw than an AMD. On the other side of things, when measured core for core at the same GHz under heavy load, the Intel Core i CPUs actually draw, on average, almost 20% more power than an AMD Piledriver. The Piledriver design is actually pretty efficient in the grand scheme of things... that said, if you measure it as a function of IPC instead of per core, the Intel will once again be the more efficient CPU.
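To put a rough number on the idle-state point, here's a sketch with made-up figures (not measurements of any real chip) showing how better idle scaling dominates total energy over a mostly idle day, even when load draw is similar:

```python
# Daily energy (Wh) for two hypothetical CPUs over a mostly idle day.
# All wattages here are invented for illustration.
def daily_energy_wh(idle_w, load_w, load_hours, total_hours=24):
    idle_hours = total_hours - load_hours
    return idle_w * idle_hours + load_w * load_hours

# Chip A: deep idle states; Chip B: weaker idle scaling, slightly lower load draw.
chip_a = daily_energy_wh(idle_w=5, load_w=90, load_hours=4)   # 5*20 + 90*4 = 460 Wh
chip_b = daily_energy_wh(idle_w=25, load_w=85, load_hours=4)  # 25*20 + 85*4 = 840 Wh

print(f"Chip A: {chip_a:.0f} Wh/day")
print(f"Chip B: {chip_b:.0f} Wh/day")
```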
 

All the charts I can find show Intel using ~20% LESS total system power than systems using similarly clocked AMD chips with the same number of cores while delivering substantially higher performance. Once you start overclocking AMD chips to close the performance gap with stock Intel performance, AMD's power consumption blows through the roof.
 


Gotta compare apples to apples. An i3 won't compare to ANY AMD offering. You need to compare an FX-4300 to an Intel i5 K-series; it's about the only 4-core vs. 4-core comparison you'll find. At the same GHz, the i5 will generally chew up almost 20% more power than the similarly clocked FX. Of course, this comparison requires turbo to be turned off and both set to a similar "not really overclocked, not really downclocked" frequency, like 3.5 GHz. You can see this even more clearly if you bump the comparison up to an FX-8350 vs. an i7 or i5 (doesn't matter which): the FX-8350 will chew up more power, but taken as a function of the number of cores, you'll see the 8350 eats up less power per core than the i5/i7.

That said, it's not really a fair comparison. If you divide the wattage being used by the CPU by its total throughput (cores × IPC) instead of by the number of cores, you'll see the Intels are slightly more efficient than the AMDs. So in the end, for the total amount of work being done, Intel is in fact slightly more efficient.
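Here's a quick sketch of how those two metrics can point in opposite directions; the wattage and throughput figures are invented for illustration, not measurements of real chips:

```python
# Power per core vs. power per unit of work, with hypothetical figures.
def watts_per_core(package_w, cores):
    return package_w / cores

def watts_per_work(package_w, relative_throughput):
    # relative_throughput ~ cores x IPC x clock, normalized to a baseline
    return package_w / relative_throughput

# Hypothetical 8-core chip vs. hypothetical 4-core chip with higher per-core IPC.
print(watts_per_core(125, 8), watts_per_core(77, 4))       # 15.625 vs 19.25 W/core
print(watts_per_work(125, 1.0), watts_per_work(77, 0.95))  # 125.0 vs ~81.1 W per unit of work
```

Same chips, opposite conclusions, depending on whether you divide by cores or by work done.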
 
For the actual work being done, Intel is FAR more efficient. There's nothing slight about it.

[attached chart: wh.png — total energy consumed (Wh)]
 

I'm not talking about the i3. Even the i7-3770K OVERCLOCKED typically uses less power than high-end AMD quads at stock clocks.
 
The Anandtech Vishera review found that in x264 video encoding, a system with an FX-4300 drew 135.8W on average, while a Core i7-3770K system drew 119.8W on average. The FX-6300 etc. drew more than the 4300 of course, and the Core i5s and i3 less than the 3770K.
 
No, that's total energy. You can't have total power; it's like saying total miles per hour.
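To make that distinction concrete: for a fixed job, energy is average draw times run time, so a lower-draw system that runs longer can still burn more watt-hours. The draw figures below are the AnandTech averages quoted above; the encode durations are hypothetical, chosen only to illustrate the point:

```python
# Energy (Wh) = average power (W) x time (h).
# Average-draw numbers are from the quoted AnandTech review;
# the encode durations are hypothetical, for illustration only.
def task_energy_wh(avg_power_w, minutes):
    return avg_power_w * minutes / 60.0

fx_4300 = task_energy_wh(135.8, minutes=30)   # ~67.9 Wh
i7_3770k = task_energy_wh(119.8, minutes=18)  # ~35.9 Wh

print(f"FX-4300:  {fx_4300:.1f} Wh")
print(f"i7-3770K: {i7_3770k:.1f} Wh")
```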

The only reasonable comparisons, IMO, are an i5-4670K vs. the FX-4300, and possibly some stuff with Trinity/Richland. You can't really compare the 8-core FXs to an SB-E Xeon E5, because one is quite a bit out of date.
 
The hierarchy chart is that big so that people with older CPUs have a point of reference.

But it has some really big issues. For example, the A8 and A10 series APUs perform about the same as Intel's older Core 2 Quads (the Q6000 and Q8000 ones), but are shown higher.

Another thing: two newly launched games that I've played are TW: Rome II and Arma 3. Both are multi-threaded, but the implementation isn't proper, so my Q8400 is a gigantic bottleneck because one core is under tremendous load.

In this case, a hyper-threaded i3 would probably do much better than my quad core.

These two games also don't play nice with AMD CPUs, reportedly, and Rome II has a Haswell-only optimization.
 
"The only real news bit is that the 3.3 GHz Pentium G3220 is Intel's fastest Pentium ever, breaking the record set by its Wolfdale-based Pentium E6700 at 3.2 GHz."

This is false. The Pentium E6800 (which I own) is 3.33 GHz.

 