How does higher wattage compute to electricity costs?

For typical apps that don't load the CPU very much, less than $1/year. For full-load apps, no more than $5/year, and probably under $3/year.
 



Thanks...good to know....won't be a factor in my choice then.

 


People aren't concerned about power consumption because of their power bills; they're more concerned because the more power a CPU consumes, the hotter it will run. Keep that in mind.
 
Though all of the above posters are absolutely correct, I'd like to add that, in general, a higher wattage rating does not necessarily translate to higher power consumption. The wattage ratings are not a measure of actual average energy use based on any realistic model of how people use their computers. They also fail to take low-power idle states into account (to my knowledge; I believe AMD and Intel derive their wattage ratings in slightly different ways, but in the end they are meant as a guideline for cooling more than for power usage). For two similar CPUs such as the two you mentioned, I would make a wild guess that the actual power-draw difference at the wall between the dual and quad core, most of the time, would be relatively low - maybe on the order of 10 watts (very roughly). At 12 cents per kilowatt-hour and three hours of use per day, my very rough estimate yields a yearly cost difference of:
(10 W) * (3 hours/day) * (365 days/year) * (1 kW / 1000 W) * ($0.12/kWh) ≈ $1.31
That assumes the difference between the two CPUs (in otherwise identical systems) is 10 watts. A reasonable range for the difference would be, say, 1 watt to 40 watts (1 watt being much lower than I'd guess it is, and 40 watts being much higher than I'd guess it is, except maybe under max CPU usage). That range puts the yearly cost between 13 cents and $5.26. No matter what, the cost doesn't seem to matter.
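If you want to plug in your own numbers, the same back-of-the-envelope math looks like this in Python (the 10 W difference, 3 hours/day, and $0.12/kWh are just the guesses from above, not measurements):

```python
def yearly_cost(watt_difference, hours_per_day=3, rate_per_kwh=0.12):
    """Rough yearly cost in dollars of an extra watt_difference watts of draw."""
    kwh_per_year = watt_difference * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(round(yearly_cost(10), 2))   # 1.31 -> the ~$1.30 estimate above
print(round(yearly_cost(1), 2))    # 0.13 -> low end of the 1 W - 40 W range
print(round(yearly_cost(40), 2))   # 5.26 -> high end of the range
```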

All that just goes to show how you can do the sort of calculations that accord99 probably did, and that he isn't just pulling those numbers out of his butt.
 
Mattc is getting closer to the truth. The figures listed aren't power consumption numbers but the amount of heat the CPU is rated to dissipate. The difference in power use will also depend on what you use the computer for. If it's just surfing or other low-end use, there probably won't be much of a difference at all; with SpeedStep enabled, both chips will stay in their low-power state.

If you are worried about power, pay more attention to your PSU (including its efficiency and the output level at which it's most efficient), the video card, the number of drives/RAM sticks, and the motherboard. The PSU and video card will have a much bigger effect on power use than the CPU choice.
 
A typical PC with a roughly 350-watt power supply, running 12 hours a day, will cost about $5-$10 a month, depending on the electricity rate in each state. This is U.S.-based, by the way.
 
The more heat output from components such as the PSU, CPU, graphics card, etc., the less efficient the system is. The performance-per-watt war between Intel and AMD is all in the name of efficiency. More performance means less time and fewer watts consumed, which means less wasted money, less heat output, and lower cooling costs. Maybe not much impact on a regular dorm with one PC, but in the business world it matters: offices with hundreds of PCs, as well as servers.
 
My dual Athlon MP box can idle at 500 W of incoming power (call it a little under 400 W of actual power delivery at 75% efficiency).

I stopped running it 24/7 because of this, as 500 W steady is 12 kWh a day, or 360 kWh a month. At NYC's inflated 20 cents a kWh, that's roughly $70 a month to run my computer!
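For anyone checking my math, it's the same kind of arithmetic as earlier in the thread, just at 24 hours a day (the 500 W idle draw and $0.20/kWh are my own numbers, not anything universal):

```python
watts = 500    # measured power at the wall while idling
rate = 0.20    # NYC rate in $/kWh
kwh_per_day = watts * 24 / 1000      # 12.0 kWh/day
kwh_per_month = kwh_per_day * 30     # 360.0 kWh/month
print(kwh_per_month * rate)          # 72.0 -> the ~$70/month figure above
```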
 


They don't teach math anymore in school?
 
Valis,

Yes, they may teach math, but apparently you missed the analysis classes.

There is far too little data in the highlighted part to even begin an analysis. The poster seems to realize this, hence his question. Sadly, you do not.

Fretman, do not buy a Q6600 that is rated at a TDP of 105 W.
You want a G0 stepping, which is rated at a TDP of 95 W.

 


The poster only realized the additional variables after being told of them. His request was straightforward and easily answered if one can add and multiply to find out how much 40 W of electricity costs.

And I don't remember asking YOU this question, but thanks for adding... nothing...

Valis
 

An excellent point. If you're really worried about your electric bill, you should invest in the most efficient PSU possible. If you get some generic unit rated at 60% efficiency, that means 40% of the electricity it pulls from the wall is simply lost (most of it as heat). In effect, your whole system draws more electricity at the wall because of the cheap PSU. I'm pretty sure it's possible to find a PSU with an 85% (or higher) efficiency rating now. Anyway, the power consumption difference between those two processors at idle or partial load will likely be much closer than 65 W vs. 105 W suggests.
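To make the efficiency point concrete, here's a rough sketch of how wall draw scales with PSU efficiency (the 300 W DC load is just an example figure, not a measurement of any particular system):

```python
def wall_draw(dc_load_watts, efficiency):
    """Watts pulled from the wall for a given DC load and PSU efficiency (0-1)."""
    return dc_load_watts / efficiency

print(wall_draw(300, 0.60))             # 500.0 W at the wall, 200 W lost as heat
print(round(wall_draw(300, 0.85), 1))   # 352.9 W at the wall, ~53 W lost
```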
 
Thanks to all for the suggestions. I'm going with the Antec Sonata III case, which I believe comes with a 500 W power supply.

I also contacted my store, and it appears they will be getting the Q6600 with G0 stepping, so I believe that would be 95 W.

 
(watts / 1000) * hours * rate = cost

For example: at $0.10/kWh, 40 watts costs $35.04/year running 24/7, or $4.38/year at only 3 hours/day. Actual electrical usage from just changing the CPU will be a bit different than that, though, and electrical rates in the USA range from about $0.06/kWh to $0.30/kWh.
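Spelled out as a tiny script so you can try your own rate and hours (reusing the $0.10/kWh and 40 W figures from the example above):

```python
def cost_per_year(watts, rate_per_kwh, hours_per_day=24):
    """Yearly electricity cost in dollars for a constant draw of `watts`."""
    return watts / 1000 * hours_per_day * 365 * rate_per_kwh

print(round(cost_per_year(40, 0.10), 2))                    # 35.04 -> running 24/7
print(round(cost_per_year(40, 0.10, hours_per_day=3), 2))   # 4.38  -> at 3 hours/day
```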

There are, however, a number of other reasons to keep electrical usage low: heat and any additional costs associated with getting rid of it (air conditioning, noise, reduced component life span, user comfort); electrical circuit load and stability (your house circuits, that is); pollution caused by electrical generation; the economy.

I would say pick the one that is best suited to your tasks and save energy someplace where you can make a bigger difference and save more money.