InvalidError :
Wisecracker :
$100 / $0.11 (avg US cost per kw/h) = 909.091 KWs
909.091 KWs = 909,091 watts
909,091 watts / 365 days = 2,490.66 watts per day
(rolling eyes)
Not sure what the "rolling eyes" are about there. Electrical energy consumption is measured in watt-hours (Wh), which is power integrated over time, with time counted in hours. 1 Wh is the energy of a 1 W (i.e. 1 J/s) load running for a whole hour, which is 3.6 kJ.
Take your 2.49 kWh/day and divide it by 24 hours/day to bring that down to actual watts, and you get:
2.49 kWh/day × 1 day/24 h ≈ 104 W.
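If anyone wants to sanity-check that conversion, here is a minimal Python sketch of the same arithmetic (the 2.49 kWh/day figure is the one from the quote above; the variable names are just mine):

# Convert energy used per day (kWh) into the equivalent constant power draw (W).
ENERGY_PER_DAY_KWH = 2.49   # figure from the quoted $100 / $0.11 example
HOURS_PER_DAY = 24

average_power_w = ENERGY_PER_DAY_KWH * 1000 / HOURS_PER_DAY  # kWh -> Wh, then Wh / h = W
print(f"{average_power_w:.0f} W average draw")               # prints "104 W average draw"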
InvalidError :
nokiddingboss :
this discussion about power consumption is reaching absurd heights. isn't this about "gaming cpu's" and not "best low power cpu - save some greenbacks by going green and save the earth too or some other bullshit" or did i actually click on some hippie website?
My point was that with Intel CPUs, people who leave their PCs on and doing something most of the time can have both: save $100/year on power, which pays for the more expensive CPU and platform in relatively short order, while still having some of the best gaming/work/whatever processing power available in the mainstream segments.
Where does anyone get a $100/year difference in energy consumption costs?
Take these two charts:
http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584-17.html
http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487-19.html
The difference in "Active Idle" power consumption between the FX-6350 and i3-3225 based systems is not quite 20 watts. That's "Active" idle. I'm sure if you leave your system on 24x7, it will spend time in various power-saving states drawing even less than that.
Take the 20-watt difference and multiply by the total number of hours in a year: 20 × 24 × 365 = 175,200 watt-hours. Electric companies charge by the kWh (that's 1,000 watts of power consumed over a period of one hour, or basically like running your microwave for an hour), so divide by 1,000 to get about a 175 kWh total energy consumption differential between the two systems for the year.
Multiply by your power company's rate. This website (http://www.eia.gov/electricity/state/) has the average for the US at not quite 10 cents per kWh. If you are lucky enough to live in CA, then you pay $.15/kWh; if you live in New York, it's $.18 (sucks to be you). If you're lucky enough to live in the great red state of Idaho, you only have to pay $.08. So, for the average of about 10 cents (according to the linked website), you now have $17.50 for the year.
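If you'd rather not do the multiplication by hand, the whole calculation fits in a few lines of Python (a rough sketch; the 20 W gap and the state rates are the figures quoted above, and the names are mine):

# Annual cost of a constant 20 W difference in idle power draw.
IDLE_DIFF_W = 20              # "Active Idle" gap between the two systems
HOURS_PER_YEAR = 24 * 365     # 8,760 hours

kwh_per_year = IDLE_DIFF_W * HOURS_PER_YEAR / 1000   # 175.2 kWh

# Average residential rates in $/kWh mentioned above (see the eia.gov link).
rates = {"US average": 0.10, "California": 0.15, "New York": 0.18, "Idaho": 0.08}

for region, rate in rates.items():
    print(f"{region}: ${kwh_per_year * rate:.2f}/year")   # US average: about $17.50/year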
Only at full gaming load for 24 hours/day, 365 days per year do you approach a $100 energy cost differential (see the quick check at the bottom of this post). If you are gaming 24 hours per day, 365 days per year, then the "well, I use my PC for other things besides gaming" argument is moot, and you probably want to fork over the extra $8/month on the much better-for-gaming FX-6300 CPU.
Otherwise, the power consumption argument is for whiners.
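And to put a rough number on the "only at full load 24/7" point, here's a quick back-of-the-envelope Python check of how big a constant power gap you'd need before the bill difference actually reaches $100/year at the ~10 cent average rate (the target and rate are the thread's figures; whether a full gaming load really opens up a gap that size is the thread's claim, not something measured here):

# How many constant watts of extra draw cost $100/year at $0.10/kWh?
TARGET_COST_PER_YEAR = 100.0   # dollars
RATE_PER_KWH = 0.10            # US average rate quoted above
HOURS_PER_YEAR = 8760

required_watts = TARGET_COST_PER_YEAR / (RATE_PER_KWH * HOURS_PER_YEAR / 1000)
print(f"~{required_watts:.0f} W constant difference")   # prints "~114 W constant difference"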