Best Gaming CPUs For The Money: January 2012 (Archive)



There is no difference (except, possibly deep within your mind...)

At idle (which is where the system will spend the majority of its time during those tasks), the FX-6300 consumes all of 2-3 W more than an i3-3240.



 


You are correct there. Though an APU would be cheaper and sufficient for most office work.
 

Depending on the usage scenario, the TCO of an Intel build can be lower due to much lower power draw for a given performance level. For people like me who have some degree of load on their PC practically 24/7, saving 100 W by going Intel can save as much as $100/year on power, so the ~$100 extra system cost can pay for itself as early as the first year.
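To make that claim concrete, here is a minimal Python sketch of the arithmetic; the 100 W sustained difference is taken from the post above, while the roughly $0.115/kWh electricity rate is an assumed figure chosen only to show how the numbers land near $100/year.

HOURS_PER_YEAR = 24 * 365  # 8760 hours

def annual_cost(delta_watts, rate_per_kwh):
    # Yearly cost of drawing an extra delta_watts continuously.
    kwh_per_year = delta_watts * HOURS_PER_YEAR / 1000.0
    return kwh_per_year * rate_per_kwh

# 100 W delta (from the post) at an assumed ~$0.115/kWh rate:
print(round(annual_cost(100, 0.115), 2))  # ~100.74 dollars/year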
 

In actual perception and experience, no, there isn't. But if you're going by benchmarks and scores, the i3 is still a little ahead.
 

whoa, don't tell them that!
because -
no one cares about desktop power consumption. so any and all power savings with a desktop pc henceforth shall be disregarded, as decreed by the people (mostly c.a.l.f.).
also, no one knows about varying power tariff in different areas because power cost is nearly free everywhere.

on a sidenote, since everyone lives in an air-conditioned, 20-24°C environment, high temperatures due to heat dissipation are utterly a non-issue.

amd cpus have more transistors (2 billion!) so they use more power, duhhh. they also have such low idle power use (which is due to absolutely no help from motherboard manufacturers or bios updates or improved vrm/mobo throttling mechanisms. it's only amd's own credit.) that their higher average and much higher load power consumption (@stock, let alone o.c.) figures are useless.

@office pcs: i assume amd apus lose more leverage as they don't support technologies like tpm and txt (that's why amd will incorporate arm's trustzone ip in future apus. that news drove c.a.l.f. into a frenzy shouting amd is gonna be fusioning arm apus in 2013).
 
$100 / $0.11 (avg US cost per kw/h) = 909.091 KWs
909.091 KWs = 909,091 watts
909,091 watts / 365 days = 2,490.66 watts per day

(rolling eyes)

At least Onus made a reasonable point on AM3+ ITX mobos: there are AM3 ITX motherboards, but their prices are far from reasonable these days :)

 

Not sure what the "rolling eyes" are about there. Electrical energy consumption is measured in watt-hours (Wh), which is the integral of power over time, with time counted in hours. 1 Wh represents the energy of a 1 W (or 1 J/s) load operating for a whole hour, which is 3.6 kJ.

Take your 2.49 kWh/day and divide it by 24 hours/day to bring that down to actual watts, and you get:

2.49 kWh/day * 1 day / 24 h ≈ 104 W.
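The same conversion as a tiny Python sketch, using only the 2.49 kWh/day figure from the earlier post:

kwh_per_day = 2.49                    # figure from the earlier post
avg_watts = kwh_per_day * 1000 / 24   # kWh -> Wh, spread over 24 hours
print(round(avg_watts))               # ~104 W average draw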
 

Watts per day? That would be a unit of energy consumption acceleration.
 

And then divide it by four or so to get what it'd be for people in places with pricier electricity. In Hawaii it's something like 40 cents per kWh, and similar in parts of Europe.
 
[Power consumption chart, from Xbit Labs]

They're both great processors ... I'm not sure why youse guys go all FUD-crazy when a $110 AMD chip threatens your self-esteem.

Dance !!

 

Some of us logically and calmly point out some areas where Intel is superior (your own cited references prove it). You then reply with insults (because that so inclines others to listen to you), and yet we're the "FUD crazy" ones? So anyone bringing up some of Intel's finer points is a stupid dancing monkey? Right.

Pot . . . Kettle. I'm sure you're acquainted.
 

http://www.tomshardware.com/reviews/fx-8350-vishera-review,3328-16.html

I don’t have a chart for each machine’s idle power consumption. Looking over each log, however, tells us that the Core i5 and Core i7 idle the lowest (roughly 79 and 80 W, respectively). The Phenom II X6-based system pulls about 102 W from the wall when it’s not doing anything. And the machine with an FX-8150 draws 92 W. FX-8350 fares no better, idling at 92 W as well.
 
The title of the article references the best gaming CPUs, not the best idle CPUs. What are the power-consumption numbers for a system while playing a demanding game? Please note that the number and type of hard drives and fans can easily overcome a few watts' difference, not to mention the graphics card(s) used and the efficiency of the PSU. While I agree that power consumption is important, let's not lose the forest for the trees...

 
Gamers worried about a power-consumption difference equal to or less than that of a standard incandescent light bulb amuse me. Unless you are gaming 24/7 you will not notice much difference in your power bill. What you set your AC, heat, refrigerator, freezer, hot water heater, etc. to will have a far bigger impact than a gaming PC will, unless you live in your parents' basement and never leave. If that is the case, you don't pay the power bill anyway, and basements tend to be cold, so the extra heat is nice. 😛

Had we been talking about F@H-type rigs, then I could see how you would want a more efficient rig, as they are on 24/7. For a gaming rig, not at all.
 


So what if they are banned in some areas? That matters about as much as the price of rice in China to this discussion. I was merely adding a visual comparison to show the silliness some make over power consumption. Your system isn't just a gaming rig if you do F@H, so I could see power consumption mattering in your case. But for 99% of home gaming rigs and everyday PCs, power consumption is a non-issue.
 
My system is a gaming rig. I didn't build it to run F@H, I built it to run games (and do some browsing and writing and so on). It just runs F@H when I'm not gaming, because why not?

Just like people could run F@H on their PS3s, back when their hardware was still any use. That didn't make PS3s into "F@H rigs" either.
 
this discussion about power consumption is reaching absurd heights. isn't this about "gaming cpus" and not "best low power cpu - save some greenbacks by going green and save the earth too or some other bullshit", or did i actually click on some hippie website?
 

My point was that with Intel CPUs, people who leave their PCs on and doing something most of the time can have both: save $100/year on power, which pays for the more expensive CPU and platform in relatively short order, while having some of the best gaming/work/whatever processing power available in the mainstream segments.
 




Where does anyone get $100/year difference in energy consumption costs?

Take these two charts:
http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584-17.html
http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487-19.html

The difference in "Active Idle" power consumption between the FX-6350 and i3-3225 based systems is not quite 20 Watts. That's "active" idle; I'm sure if you leave your system on 24x7, it will spend time in various power-saving states drawing even less than that.

Take the 20 Watt difference and multiply by the total number of hours in a year. That's 20x24x365 = 175,200 Watt-Hours. Electric companies charge by the kWh (that's 1000 watts of power consumed over a period of one hour, or basically like running your microwave for an hour), so divide that by 1,000 to get about 175 kWh total energy consumption differential between the two systems for the year.

Multiply by your power company's rate. This website (http://www.eia.gov/electricity/state/) has the average for the US at not quite 10 cents per kWh. If you are lucky enough to live in CA, then you pay $.15/kWh; if you live in New York, it's $.18 (sucks to be you). If you're lucky enough to live in the great red state of Idaho, you only have to pay $.08. So, at the average of about 10 cents (according to the linked website), you now have $17.50 for the year.
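As a quick Python sketch of the same arithmetic (the 20 W idle delta and the state rates come from the paragraphs above; the function name is just illustrative):

def yearly_cost(delta_watts, rate_per_kwh):
    kwh = delta_watts * 24 * 365 / 1000.0   # 20 W -> ~175 kWh per year
    return kwh * rate_per_kwh

for rate in (0.08, 0.10, 0.15, 0.18):       # ID, US average, CA, NY
    print(rate, round(yearly_cost(20, rate), 2))
# roughly $14.02, $17.52, $26.28, and $31.54 per year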

Only at full gaming load 24 hours/day, 365 days per year do you approach a $100 energy-cost differential. If you are gaming 24 hours per day, 365 days per year, then the "well, I use my PC for other things besides gaming" argument is moot, and you probably want to fork over the extra $8/month on the much better-for-gaming FX-6300 CPU.

Otherwise, the power consumption argument is for whiners.
 
In large parts of Europe, the electricity price is three to five times higher than the US average. And I don't even want to think about what it must cost in Japan, with all their nuclear power plants shut down.
 


http://en.wikipedia.org/wiki/Electricity_pricing

Average cost of electricity in Japan according to that source is US $.20-.24/kWh, so we're now up to US $35-42 for the year. And that's very conservative. One source linked in this thread has the idle power differential at 2-3 Watts. So that would be a US $6-7 difference in Japan. Shiver me timbers.....
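Running the same arithmetic at the quoted Japanese rates, as a quick check (the 20 W delta and the $0.20-0.24/kWh range come from the posts above):

kwh_per_year = 20 * 24 * 365 / 1000.0           # ~175 kWh for a 20 W delta
for rate in (0.20, 0.24):
    print(rate, round(kwh_per_year * rate, 2))  # ~$35 and ~$42 per year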

EDIT: I just noticed the source of the Japan prices is a bit dated (2009). Looking at other current sources, it's actually hard to find the current rate. There are quite a few articles showing they plan to raise rates, because the power companies have been eating the losses, but they don't say by how much.
 