Do you consider power consumption when choosing a CPU?

  • Yes: 21 votes (21.6%)
  • No: 50 votes (51.5%)
  • It's a factor: 26 votes (26.8%)
  • Total voters: 97
Nope - Not at all.

For business users running large numbers of machines for 24 hours a day, 7 days a week, 365 1/4 days a year, and trying to keep their power bills down, sure. Especially if they can publish a press release containing a steaming pile saying they 'saved blah de blah amount of greenhouse gases by changing....' <barf> while they're doing it.


But for gamers/home users?? <looks at CPU... Looks at 8800GTX.... giggles> Power consumption matters not. Heat production?? Oh Yeah. But 'tron usage?? Pulheeeze... Just shut the thing off when you're done using it.
 

drysocks

Distinguished
Sep 2, 2007
Well, isn't lower power consumption parallel to lower heat production? I'm all for less heat for the same performance.
 

Thanatos421

Distinguished
Mar 26, 2007
I voted yes, but my vote was based on process more than TDP (although, they tend to scale together). For instance, I would much rather buy a 45nm chip that can run at similar frequencies using less overall power than any other process which might run a little "faster".
 
Absolutely true, they are most definitely related. Efficient designs generally use less power, and they are generally faster too. So we do look at power consumption as a sign of an efficient design. But the reason we look isn't how many 'trons it's using - it's the heat~performance ratio.

I know real world physics don't work like this, but by way of example: If a processor was 10% faster for twice the power requirement, yet magically produced half the heat!?!? Home/Gaming Enthusiasts would buy it! Conversely, if two processors used the same power and were otherwise equivalent, the cooler-running one gets the nod.
 

andytg7

Distinguished
Apr 25, 2007
Sure, of course I consider the power consumption factor. Like UncleDave said, more power generally means more heat production, and more heat has the potential to raise the room temperature, making your AC run more often to bring it back down to your desired setting. For someone like me who lives in Miami, FL, AC usage plays a big role in the electric bill. So building a PC that has great performance and low power consumption, and therefore produces less heat, is something I aim to do.
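
To put rough numbers on that: every watt the PC dissipates is heat the AC has to pump back out of the room. A minimal sketch, assuming a typical AC moves heat with a COP of about 3 and electricity costs $0.10/kWh (both numbers are assumptions, not measurements):

[code]
# Rough annual cost of a PC's power draw in an air-conditioned room.
# Assumptions: AC coefficient of performance ~3 (each watt of heat
# removed costs ~1/3 W at the meter), electricity at $0.10/kWh.

def annual_cost_with_ac(pc_watts, hours_per_day=8.0,
                        rate_per_kwh=0.10, ac_cop=3.0):
    pc_kwh = pc_watts * hours_per_day * 365 / 1000  # PC's own energy
    ac_kwh = pc_kwh / ac_cop                        # AC work to remove that heat
    return (pc_kwh + ac_kwh) * rate_per_kwh

# A 300 W box vs. a 200 W efficient build, 8 hours a day:
for watts in (300, 200):
    print(f"{watts} W -> ${annual_cost_with_ac(watts):.2f}/yr incl. AC")
[/code]

In other words, in an AC'd climate the effective price of every watt is roughly a third higher than the raw meter rate.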
 


For someone like me who lives in the Pacific Northwest, heat is no problem at all.
 

zenmaster

Splendid
Feb 21, 2006


Definitely Power Usage is an important factor.
More Power Usage = More Cooling Requirements = More Noise or Very Very Expensive Silent Options.

Right now my system is highly OC'd and very quiet with a mid-level aftermarket HS & fan.
I can OC another 200MHz quite easily, but that introduces noise.
Who wants that when you can't tell the extra 200MHz is there?

If my CPU was a hot chip, I would be hearing lots of noise all of the time.

In my case, it's more important than most since I have 4 systems running in my room.
 

TeraMedia

Distinguished
Jan 26, 2006
For a CPU, power consumption IS heat. There's nowhere else for the energy to go. So for my HTPC this is especially important. I just wish my mobo supported a more efficient CPU.
 

UncleDave

Distinguished
Jun 4, 2007
Since socks mentioned it first....



I must admit that I didn't think of the basic power = heat equation :non: . My gut still tells me that people don't buy chips based on power consumption. When people overclock they will [strike]always mostly occasionally[/strike] sometimes look at improving the stock cooling, but that is to ensure that their chips don't Chernobyl. That is purely heat management; I would argue that power consumption as a consideration goes out of the window as soon as you start to overclock. As the zenmaster says, he is worried about managing heat and noise, NOT power consumption.

I agree 100% with Scotteq

p.s. At the risk of trolling :bounce: - I also wonder how much, if at all, aftermarket coolers can increase power consumption?
 
Hmmm, I didn't think that was trolling. If the concern is heat, it's a valid question. They almost all have fans though, so I doubt there's much difference at stock.

Some people like to oc to the last possible MHz, and I'm sure they don't give a congressman's butt about power consumption. As Reynod points out though, to many others it will matter.
Personally, having replaced all my bulbs with CFLs years ago, I like knowing my power bill is low. I won't give up a required or expected level of performance, but otherwise yes, the power usage will occur to me as I'm making my choices.

Oh yes, and I also agree with Scotteq.
 

johngoodwin

Distinguished
Dec 15, 2005

For me power consumption is a factor, but the CPU is only one of many power draws in the box. Basically, if your PC is on an average of 8 hrs/day, 1 wasted watt costs about $0.29/yr (if you're paying $0.10/kWh). Running 24/7, it's around $0.88/yr.

As you can see, it takes a fair amount of wasted watts to make much of a difference annually, but it does make a difference. Let's say your machine uses 100 watts more than it has to for the performance you really need to be happy. That's close to $88 a year your machine costs extra (not including the price of hardware).
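
A quick sketch of that arithmetic, using the assumed $0.10/kWh rate (your utility's rate will differ):

[code]
# Yearly cost of wasted watts at an assumed $0.10/kWh.
RATE = 0.10  # $/kWh

def yearly_cost(watts, hours_per_day):
    return watts * hours_per_day * 365 / 1000 * RATE

print(f"${yearly_cost(1, 8):.2f}/yr")     # one wasted watt, 8 h/day -> $0.29
print(f"${yearly_cost(1, 24):.2f}/yr")    # one wasted watt, 24/7 -> $0.88
print(f"${yearly_cost(100, 24):.2f}/yr")  # the 100 W example -> $87.60
[/code]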

That said, I'd say most people waste more power on their graphics processor. I don't understand why they haven't made them use about 10W for standard 2D stuff.

John
 

righteous

Distinguished
Oct 25, 2006
To me, power consumption on a desktop with a single CPU is not a factor I take into consideration.

If I had a farm of servers, then possibly.

I think the point of power consumption when selling to the desktop gaming market is a bunch of Al Gore-ish BS and carries no weight, at least in my decision-making process.

If I can't pay my electric bill to begin with, what the hell would I be buying a quad-core platform with SLI/Crossfire for?

 

Grimmy

Splendid
Feb 20, 2006
Well.. don't forget guys, it's not just the CPU. For example, what video card you run:

Again, my dad's E4300 uses around 107-109 watts idle (@ 2.4GHz) with a 7300GT. My system (E4400 @ 3GHz) is very similar to his, but I run the 8800GTS 320MB card. It idles at 170 watts (I did add an HD TV card to my system).

So when I'm running Folding on both cores, my system uses 207-210 watts, and that's without the GPU running or doing anything at... all. :cry:

So all my power savings are pretty much shot from wanting a good GPU.

I use a P3 Kill A Watt meter, btw. With both my new and old systems running off the UPS, my total draw is 497-510 watts, adding around another $40 to the electric bill.
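
If that $40 is per month, the Kill A Watt readings roughly back it up; both the monthly reading of it and the $0.10/kWh rate below are my assumptions:

[code]
# Sanity check on the ~$40 figure: two systems pulling ~500 W
# around the clock, at an assumed $0.10/kWh.
watts = 500  # middle of the 497-510 W Kill A Watt reading
kwh_per_month = watts * 24 * 30 / 1000
print(f"${kwh_per_month * 0.10:.2f}/month")  # ~$36, close to $40
[/code]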
 

epsilon84

Distinguished
Oct 24, 2006
Of course it matters.

Obviously more heat also means louder fans, as well as a higher electricity bill from your PC *and* the air conditioner during summer. It also has a knock-on effect on your PSU purchase - a lower power chip can get away with a cheaper PSU.

Most overclockers actually DO care about power consumption figures when choosing a CPU - just not the way most 'normal' (non-OC) people think about it. Overclockers will always prefer a lower TDP, if possible, to maximise their chances of a big overclock. It gives them more 'thermal leeway'. Why do you think everyone prefers the 95W G0 Q6600 over the 105W B3 Q6600? 10W ain't gonna make you broke on the electricity bill, but 10W less heat may mean a 100MHz higher overclock.
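
The leeway argument in toy-model form: with a fixed cooler, die temperature rises roughly linearly with power (ambient + power x thermal resistance), so every watt of stock TDP you don't spend is a watt of overclocking headroom. The cooler figure and temperature limit below are made up for illustration, not real specs:

[code]
# Toy thermal-headroom model: T_die ~ T_ambient + P * theta.
# theta (C/W) and the 70 C comfort limit are assumptions.
AMBIENT = 25.0   # C, room temperature
THETA = 0.25     # C/W, assumed mid-range air cooler
T_MAX = 70.0     # C, assumed limit for a stable 24/7 overclock

def headroom_watts(tdp_watts):
    # Watts of extra (overclocking) heat that still fit under T_MAX.
    return (T_MAX - AMBIENT) / THETA - tdp_watts

print(headroom_watts(95))   # G0 Q6600: 85.0 W of leeway
print(headroom_watts(105))  # B3 Q6600: 75.0 W of leeway
[/code]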
 

epsilon84

Distinguished
Oct 24, 2006


I have a similar system, and I observed that as well (although I didn't actually measure the wattage of my PC, I knew my 8800GTS 320 was chewing up the most power). My solution? UNDERCLOCK it for 2D / desktop stuff!

The default clocks are: 513c/792m
My 'custom' 2D clocks: 111c/155m

Yes, I have underclocked my 8800GTS 320 to less than a quarter of its stock speeds... it will not go any lower or I get graphical corruption on the desktop. :p I'm not sure exactly how much power it saves, but I assume it's a fair amount, since temps drop about 10°C below stock.

Btw, I use ATITool to make custom clockspeed profiles depending on usage. When I fire up a game, the card is automatically overclocked to 660c/1100m. When I quit the game it reverts back to 111c/155m. Talk about going from one extreme to the other, huh. :lol:
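
Why underclocking helps at all, and why the savings eventually flatten out: at a fixed voltage, dynamic power scales roughly linearly with clock (P ≈ C·V²·f), while static/leakage power doesn't move. A sketch with invented numbers, not measured 8800GTS figures:

[code]
# Idle power vs. core clock under a simple dynamic + static split.
# Both wattage constants are illustrative guesses.
STATIC_W = 40.0    # leakage + memory/board overhead, doesn't scale
DYNAMIC_W = 30.0   # clock-dependent draw at the stock 513 MHz core

def idle_power(core_mhz, stock_mhz=513):
    return STATIC_W + DYNAMIC_W * (core_mhz / stock_mhz)

print(idle_power(513))  # ~70 W at stock
print(idle_power(111))  # ~46 W underclocked: real, but bounded, savings
[/code]

Dropping the voltage as well would save more, since dynamic power scales with the square of voltage.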
 

Grimmy

Splendid
Feb 20, 2006
I did underclock the GPU. It only lowered consumption by about 13 watts.

Also, I really don't want to keep switching it back and forth when I do want to play games. I just don't know why my 8800GTS needs to consume about 60 watts more at idle to begin with. It's doing nothing more than what my old 6800GS (AGP) card did when just viewing the 2D desktop... :cry:
 

epsilon84

Distinguished
Oct 24, 2006


How far did you underclock it though?

I've had my 8800GTS at this config for the past 6 months now, no problems so far.
 

Grimmy

Splendid
Feb 20, 2006


As far as RivaTuner could go:

8800-p3-nor.jpg - Normal (EVGA 8800GTS 320MB KO) at 588MHz

8800-p3-red.jpg - Turned down to 294MHz... :lol:

Now you have to remember these readings were from before I put my new TV card in... just adding that card was 7 watts, and that's not even watching TV. And one weird thing I noticed: when I close the TV program, the consumption stays the same, as if I didn't even turn the TV card off. I have to switch it to composite, then the consumption goes back to the... err... normal 170 watts.
 

evilr00t

Distinguished
Aug 15, 2006


Use ATITool, I got my 8800GTS 320MB down to 85MHz (and it still runs UT2004 at 60fps!)