Curbing Your GPU's Power Use: Is It Worthwhile?

Given the cost to run a current PC these days, I don't know what the point would be in messing around with this. Graphics cards can use a lot of juice, though, so if there's an easy way to reduce consumption when performance isn't needed, then go for it.

Your time would be better spent doing things around your home to reduce power consumption, like switching to low-power bulbs, sealing or upgrading windows, and making sure your home's insulation is up to par. Climate control units use an enormous amount of power, and keeping them from running needlessly because of poor insulation could save you enough per year to power ten PCs.

 
Why did you choose such a complicated way? I just overclock and underclock my R5870 with ATT by pressing F2 and F3. Even downclocked to 150/300, it's more than capable of handling Flash and 1080p.
 
[citation][nom]spacefrog[/nom]Why did you choose such a complicated way? I just overclock and underclock my R5870 with ATT by pressing F2 and F3. Even downclocked to 150/300, it's more than capable of handling Flash and 1080p.[/citation]

As noted in the article, just changing clocks (both core and memory) doesn't necessarily mean you're getting lower consumption.
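A first-order way to see why: CMOS dynamic power scales roughly as P ≈ α·C·V²·f, so cutting the clock only attacks the linear frequency term, while voltage enters squared and leakage power is untouched. A minimal sketch of that scaling (the reference clock and the operating points below are illustrative placeholders, not measured values):

```python
# First-order CMOS dynamic-power scaling: P_dyn ~ alpha * C * V^2 * f.
# Leakage (static) power is ignored, which is one reason measured savings differ.

def relative_dynamic_power(v, f, v_ref=1.15, f_ref=775e6):
    """Dynamic power relative to a reference voltage/clock operating point."""
    return (v / v_ref) ** 2 * (f / f_ref)

# Downclock only (voltage left at stock): ~0.52x the baseline dynamic power.
print(relative_dynamic_power(v=1.15, f=400e6))

# Downclock plus undervolt: ~0.35x the baseline dynamic power.
print(relative_dynamic_power(v=0.95, f=400e6))
```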
 
Hmmmm... Here is my thought on the situation. I can sit at home and do my thing on the computer (play games, Photoshop, surf the net, Folding@home, etc.), which might use 400-1200 watts depending on whether I'm gaming. Or, because I get bored easily like most people I know, I can drive around aimlessly looking for something to do, possibly stirring up a little trouble along the way (just saying), which burns gas and is far less efficient than my PC; do some boating (even worse); or watch TV on my home theater. I have a tube TV and a 1000-watt home theater system. Etc., etc.

I realize that a 1200-watt computer seems less efficient than, say, a laptop or a tablet, but it gets "stuff" done a lot quicker. Instead of waiting 24 hours to convert a movie on my old 450-watt Pentium 4 PC, I can now convert the same movie on my 1200-watt machine in half an hour. So instead of looking at it in terms of power consumption, maybe people should be looking at it in terms of productivity. In the end, you are actually saving electricity by being more productive.

The way I look at it, it might be costing me $100 more a year, but at least it's saving me time, and I'm not out causing trouble or doing drugs. I bet the same people who complain about PC power usage probably have a high-powered sports car in their garage, or a huge four- or five-bedroom house with only one or two people living in it.

As a consumer, I am not here to worry about power consumption. If I can save a few bucks buying more efficient light bulbs, as long as they don't lay waste to our communities by putting mercury in the landfills that seeps into our water supply, then I'll do my best. But as far as the PC goes, it's all about productivity, which equals efficiency. Besides, how many people are running Folding@home? That might take a lot of power, but it's going to help cure a lot of diseases one day.

I guarantee that if the world ends, it's not going to be because of people running high-end PCs. It'll most likely be from automobile pollution, manufacturing pollution, disease, war, or a cybernetic war via The Matrix, The Terminator, Ghost in the Machine, or some variant of the three. In ten years, chips are going to be 1000 times more powerful and run on 0.002 watts. If I'm wrong, then I'll worry about it then.

Not trying to rant, I'm bored tonight....
 
[citation]As noted in the article, just changing clocks (both core and memory) doesn't necessarily mean you're getting lower consumption.[/citation]
I also undervolt the card to 0.950 V when it's underclocked, and I checked at the power outlet; it uses a lot less power than when it's clocked normally.
 
[citation][nom]amstech[/nom]The answer is NO. Same for CPUs. Cutting power doesn't really save you money or power. Period.[/citation]

Except for when your graphics card is idling in 2D mode yet still consuming 100 W... Driver updates could easily cut power to the graphics card significantly when load is zero, and yes, that would save you money and power. Multiply that by millions of users and you have a significant reduction in pollution and waste. Maybe you should consider thinking before you type next time. Period.
 
[citation][nom]spacefrog[/nom]I also undervolt the card to 0.950 V when it's underclocked, and I checked at the power outlet; it uses a lot less power than when it's clocked normally.[/citation]

Well, then good for you. If I recall correctly, the voltage adjustment feature of ATI Tray Tools only works on supported cards. We did try ATT with our sample, and the voltage settings didn't work.
 
If you're a high-end gamer and you can afford the equipment to run games at their max, then you shouldn't be worrying about your light bill; it comes with the territory.

You don't buy a Ferrari and then expect to get good fuel mileage.

 
I'm an efficiency and quiet-cooling enthusiast, and I'm about to upgrade from an old revision 2 GeForce 7600 GT 512 MB DDR2 to a Radeon 6670 1 GB GDDR5 :)

My CPU's stock clock and voltage are 2.8 GHz at 1.3750 V, with a 65 W TDP.

Current stats are 3.4 GHz at 1.3 V.

So I'm running both cooler and faster with it, and I'd love to do the same with the new GPU :)

At the very least I'd like to run at stock, and I'm considering CrossFire, possibly with single-slot cooling...

I've got an Antec 300 with great ventilation and a big ASRock board, but I'm concerned about it getting crowded with the bulky dual-slot heatsinks and all that.

Any reduction in heat output would make me feel better about such a setup.
 
I couldn't adjust the BIOS of my HD 6850, so I decided to play with MSI Afterburner to underclock to 600/1000 MHz (the minimum allowed; any lower shows no change in GPU-Z). With the voltage at 1.148 V, I started the Unigine benchmark and the current draw was 28 A. I then changed the voltage to 1.0 V and started the Unigine benchmark again; this time the current draw was 22 A, a 21% reduction in current. Not sure what that translates to in watts (if anyone could figure out the wattage, that would be great). Doing this seems to ensure there is no unnecessary waste while doing 2D and light 3D work.

However, I noted that the minimum current draw at idle increased from 3 A to 4 A after the voltage changed from 1.148 V to 1.0 V (again, not sure what this translates to in watts). I guess this marginal increase is OK, as no one starts a computer just to let it idle.

Just an observation from me. Cheers.
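A rough conversion of those figures, assuming the amp readings are the GPU core (VDDC) current that the monitoring software reports at the voltage that was set (if they were measured somewhere else, the math changes):

```python
# Back-of-the-envelope wattage from the reported currents, assuming they are
# core VDDC current at the set voltage; treat these as a sketch, not a measurement.
readings = [
    ("1.148 V, Unigine", 1.148, 28.0),
    ("1.000 V, Unigine", 1.000, 22.0),
    ("1.148 V, idle",    1.148,  3.0),
    ("1.000 V, idle",    1.000,  4.0),
]

for label, volts, amps in readings:
    print(f"{label}: {volts * amps:.1f} W")   # P = V * I

# 1.148 V, Unigine: 32.1 W
# 1.000 V, Unigine: 22.0 W  (roughly a 31% drop in core power)
# 1.148 V, idle: 3.4 W
# 1.000 V, idle: 4.0 W
```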
 
[citation][nom]feeddagoat[/nom]With proper engine mapping, good throttle control, weight reduction from materials used, loosing carpets, spare tyre, radio etc. and the aerodynamic design of a Ferrari it isn't as stupid as you think.[/citation]
Appears intelligent... until "loosing".

/facepalm
 
[citation][nom]Anonymous[/nom]I think, considering those people using SLI and CrossFire and higher-end video cards, they don't really give a gat about how much electricity they are using. They can afford to buy two expensive PCBs, so why would they care about an extra 5-10 bucks per month? If people were focused on lower power consumption, they would go for lower-performance components, wouldn't they?[/citation]

Sometimes waste is waste, and if people want to curb their waste, good for them.
 
[citation][nom]uii[/nom]The problem with reducing power by limiting clocks etc. is that you kill your minimum frame rates as well as your maximum. No one cares about max frame rates (well, no one with any sense), but a hit to minimum frame rates can turn a playable game into crap. A better way to limit power is to use vsync. Why do you need 300 fps in a game? 75 (or whatever your monitor's refresh rate is) is completely fluid, so turn on vsync. You'll save power by not rendering stuff that doesn't even matter. Obviously this only saves power if your card is fast enough that it's rendering extra frames you won't even see. Even that is too much, however; if a game has a frame-limit setting, use it. 30 is completely smooth. Again, it's the minimum frame rates that matter, and limiting your max frame rate to about 30 will again save you lots of power. The easiest thing they could do is add a global frame-rate limit in the driver. I'd use it. The reason I use frame-rate limiting, however, is not to save money on power; it's to save on heat. Less heat means the fan runs slower, which means I don't hear a hurricane.[/citation]
The problem is that vsync causes input lag, and in any competitive game that is a huge disadvantage. Also, input lag in most games becomes noticeable under 55 fps for me; I can't play games below that.
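As an aside, the global frame-rate limit the quote asks for amounts to sleeping away whatever is left of each frame's time budget, so the GPU stops rendering frames nobody sees. A minimal sketch of that idea (purely illustrative; real limiters live in the driver or the game's render loop):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS   # frame budget in seconds

def render_frame():
    pass   # stand-in for the real rendering work

for _ in range(600):             # run a bounded number of frames for the demo
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # Sleep off the rest of the frame budget; the GPU idles here instead of
    # churning out frames above the cap.
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)
```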
 
Necro comment....

Forgive my skimming, but was there any comparison between taking one of the higher-end cards and undervolting it versus just getting a cheaper card?

Second point: heat. Power and noise are obvious, but run one of these suckers in a small closed room for a few hours and you can start making nachos on your mousepad. How much does the AC cost to keep that temperature down?

Still, we are probably looking at no more than a buck a day. But the question is simple: do I really need to see a robot sweat in-game? Personally, I would love a more efficient, cooler, quieter board that could still play all the current games at full resolution and "medium" details, rather than firing up an SLI dust-buster, the AC, and a separate 15-amp line to the computer room...
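For a rough sense of scale on that "buck a day": at typical US residential rates, a dollar a day works out to roughly 350 W of extra draw running around the clock. The figures below are assumptions for illustration, not numbers from the article:

```python
# Back-of-the-envelope cost of extra GPU draw; wattage, hours, and rate are assumptions.
extra_watts = 300        # assumed extra draw of a high-end card vs. a modest one
hours_per_day = 8        # assumed daily gaming/compute time
rate_per_kwh = 0.12      # assumed US residential rate, dollars per kWh

kwh_per_day = extra_watts / 1000 * hours_per_day
dollars_per_day = kwh_per_day * rate_per_kwh
print(f"{kwh_per_day:.2f} kWh/day -> ${dollars_per_day:.2f}/day, "
      f"${dollars_per_day * 365:.0f}/year")
# 2.40 kWh/day -> $0.29/day, $105/year
```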
 
fokissed, you are only partially right.

I think you would be able to play at under 55 if it STAYED at the same frame rate. The main problem happens when the rate keeps changing up and down.

If they were able to make a card that got a solid 60 on everything, nobody would notice. Hell, even a CONSTANT 30 would be akin to many of the console games we have had....
 
After switching to Xbox a while back for all my gaming, I uninstalled my dedicated graphics card and used the onboard graphics. I actually found that in 2D the onboard graphics were perfectly fine, and they eliminated the power draw of the dedicated card.
 