What Do High-End Graphics Cards Cost In Terms Of Electricity?


deanjo

Distinguished
Sep 30, 2008
So, in other words: for the average person, swapping the lights in their residence over to CCFLs or LED bulbs will have a more dramatic impact on their yearly electricity bill, and it easily offsets any additional power consumption from choosing a top-of-the-line card over a mid-range or low-end card.
 

bayouboy

Distinguished
Sep 12, 2009


If your goal is to save energy and have less impact on the environment, then your GPU is not even a consideration. We are talking about only 400 kWh between top end and bottom end for an entire year! This is like running your oven for 12 hours. Or better yet, doing a dozen loads of laundry with the dryer.

You want to save energy? Don't use a dryer; air-dry instead. The dryer is the single largest use of energy in a home behind heating and water. Switch out incandescents for CFL or LED bulbs; this will save you far more energy annually than what your computer uses. Set your house temperature to 55F in the winter, which is quite comfortable in my opinion, and use as little AC as possible in the summer. If you live in a dry climate, use swamp coolers rather than heat pumps to cool your home. Not only do swamp coolers improve the air quality in dry climates, they use far less energy than a heat pump. People who live in a humid climate have no choice but to use a heat pump. Improve the insulation on your home, make sure you have double-pane windows, fix the jambs on your doors to reduce air leaks, and don't have vaulted rooms in your house.

All of these things will save you far more energy than the measly 400 kWh annually that we are talking about here. Your entire argument is asinine. You are focusing on the pennies and ignoring the dollars.

If you really wanted to save energy, you'd use a netbook over a desktop. Seriously, this whole article was a ton of work just to prove, in the end, that the power consumption of high-end vs. low-end is not a consideration when purchasing. The level of performance you need and want, and how much you are willing to spend, are far more important than power consumption.

Also, one HUGE flaw in this article is that during the cold months there is no loss of money from running your computer. All that converted heat is energy that would have gone to heating anyway. If you have electric heating like I do, you can literally view the computer as running for free whenever you are heating your home.
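The yearly power gap the thread keeps arguing about is easy to estimate for yourself. A minimal sketch follows; every number in it (wattages, hours of use, and the electricity rate) is an assumption for illustration, not a measurement from the article:

```python
# Rough annual-cost sketch for the high-end vs. low-end GPU power gap.
# All numbers are illustrative assumptions, not measured figures.

HIGH_END_W = 250      # assumed draw of a high-end card under load, in watts
LOW_END_W = 75        # assumed draw of a low-end card under load, in watts
HOURS_PER_DAY = 3     # assumed daily gaming time
RATE = 0.12           # assumed electricity price, $/kWh

def annual_kwh(watts, hours_per_day, days=365):
    """Convert a sustained wattage into kilowatt-hours per year."""
    return watts * hours_per_day * days / 1000

delta_kwh = annual_kwh(HIGH_END_W, HOURS_PER_DAY) - annual_kwh(LOW_END_W, HOURS_PER_DAY)
delta_cost = delta_kwh * RATE
print(f"Extra energy: {delta_kwh:.0f} kWh/year, extra cost: ${delta_cost:.2f}")
```

Plugging in your own card wattages and local rate gives a figure you can weigh against the one-time savings of swapping out light bulbs.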
 

hangfirew8

Distinguished
Jun 19, 2009
[citation][nom]bayouboy[/nom]Set your house temperature to 55F in the winter, which is quite comfortable in my opinion, and use as little AC as possible in the summer.[/citation]

Louisiana? Slab/Rancher/Double-wide?

This 55F advice is a very bad idea for more northern climates and homes with basements. An upstairs thermostat set at 55 can lead to frozen pipes in the basement, especially in older, uninsulated, unconditioned basements.

I also dispute the "very comfortable", but that is a personal issue.
 
Guest
Well, I think this should be useful for price/performance comparisons. If running a 470 costs $50 more over the life of the card than a 6870, then I could purchase a 6950 and come out ahead in performance for the same total cost. Thus I think the annual cost should factor into the Best Graphics Cards for the Money column.
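The comparison this poster describes is a total-cost-of-ownership calculation: purchase price plus lifetime electricity. A quick sketch of the idea, with made-up prices, wattages, and rates standing in for real figures:

```python
# Total-cost-of-ownership sketch: purchase price plus electricity over the
# card's service life. All prices, wattages, and rates are assumptions.

def ownership_cost(price, watts, hours_per_day, years, rate=0.12):
    """Purchase price plus energy cost over `years` of ownership."""
    kwh = watts * hours_per_day * 365 * years / 1000
    return price + kwh * rate

# Hypothetical numbers for a GTX 470 vs. an HD 6870:
cost_470 = ownership_cost(price=260, watts=215, hours_per_day=3, years=2)
cost_6870 = ownership_cost(price=240, watts=150, hours_per_day=3, years=2)
print(f"470: ${cost_470:.2f}  6870: ${cost_6870:.2f}")
```

With numbers like these, the gap in total cost is larger than the gap in sticker price, which is exactly the point the poster is making about stepping up to a 6950.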
 

azcoyote

Distinguished
Jun 3, 2008
Are power consumption ratings listed on cards that get tested at Tom's? It seems like a simple test and calculation that would allow some seriously great comparisons for buyers who balance performance and wattage against how often they get to use their systems. Now I'm curious....
 

bit_user

Titan
Ambassador

This is only partially true. Electric heating is less cost-effective than other forms, so even when you have the home heating on, you're still paying more for the heat coming out of your PC than for what's coming out of your heating vents.

However, it's worth pointing out that anyone who uses air conditioning for at least part of the year is using even more energy to get rid of all the heat coming out of their PC. This multiplies the running cost of high-power cards, especially because air conditioning is less efficient than heating (i.e., it would cost you more to cool a room by one degree than to heat it over the same range).
 
Guest
In the winter I think the excess heat simply offsets the amount of heat needed to keep yourself warm. On the other hand, in the summer, whatever heat you generate will need to be cooled. Thus, those watts are more than doubled.
 

Marcus52

Distinguished
Jun 11, 2008
• The young gamer who mostly plays graphically-demanding shooters
• The average user who only rarely upgrades, but buys future-proof high-end hardware once in a while
• The older enthusiast who mostly buys for the fun of it

I bet you think girls don't PvP too.

:D

 

Olle P

Distinguished
Apr 7, 2010
[citation][nom]bit_user[/nom]Electric heating is less cost-effective than other forms. ...[/citation]It all boils down to what the regular heating is in the house. With electric radiators for the standard heating the addition of a high power computer makes very little difference when it's cold outside.
[citation][nom]bit_user[/nom]However, it's worth pointing out that to someone who uses air conditioning for at least part of the year, they're using even more energy to get rid of all the heat coming out of their PC. ...[/citation]So very true!

There are other aspects to the power consumption as well:
1. Environmental, as mentioned previously. Every effort to cut total power consumption counts!
2. Cascade savings. By using a lower-power graphics card you can use a PSU with a lower power rating, which is cheaper.
3. Noise! Less power drawn means less heat to vent out of the computer. Less cooling means less noise, which is great if you want a quiet computer.
 

C00lIT

Distinguished
Oct 29, 2009
[citation][nom]greghome[/nom]Technically, by having a girlfriend, u wouldn't even have the time to be a hardcore gamer anymore........[/citation]
From gamer to having a girlfriend... I miss the money savings of my not-so-energy-efficient video games.
 

Kewlx25

Distinguished
Don't forget you pay double costs when using AC. If the video card consumes 200 watts of electricity, almost 100% of that turns into heat before leaving the house.

A high end AC unit is ~90% efficient, but the motor may only be ~80% efficient.

Assume you only get ~75% return on using AC. So, for every $1 you spend powering your video card, you need to spend another $1.33 to remove the extra heat. So, every $1 costs $2.33 total.

Yes, in the winter, heat generated means less heat to warm the house, but in the summer, heat generated needs to be removed.
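The double-cost arithmetic above can be written out explicitly. The ~75% figure is the poster's assumption about how much heat the AC removes per unit of electricity spent on cooling:

```python
# Sketch of the "pay twice when air conditioning" arithmetic above.
# ac_efficiency is the assumed fraction of heat removed per unit of
# electricity spent on cooling (the post's ~75% figure).

def total_cost_multiplier(ac_efficiency=0.75):
    """Each $1 of GPU power becomes heat in the room; removing that heat
    costs an extra $1 / ac_efficiency, so the total is 1 + 1/ac_efficiency."""
    return 1 + 1 / ac_efficiency

print(f"${total_cost_multiplier():.2f} spent in total per $1 of GPU power")
```

With the 75% assumption this reproduces the post's $1 + $1.33 = $2.33; a more efficient AC setup shrinks the penalty but never eliminates it.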
 

Fadekyn

Distinguished
Jan 15, 2011
The argument that "Instead, buying just as much performance as you really need for smooth frame rates (maybe with overclocking headroom) means that the money saved on a lower purchase price and energy consumption will make it much easier to afford yet another new graphics card next year" is in and of itself uneconomical. If you add to the equation the energy cost you are responsible for through the disposal of the old card and the manufacture of the new one, your individual carbon footprint is multiplied by a much greater factor in that scenario, which should be a greater overall cost than your energy consumption.

Knowing the environmental impact of the heavy metals and chemicals used to make and dispose of these cards, it is irresponsible of this article to ignore that scenario in favor of focusing solely on your out-of-pocket electrical cost. Taxes, clean-up programs, the future cost of components, and other factors are all affected by how long you retain your component, and they impact your out-of-pocket expenses just as much. So while it is much easier to calculate your expenses by focusing solely on your electrical consumption, I argue you save just as much by buying the higher-end card that will last you longer and defer your future purchase of a newer card. Through that deferral, your other, less measurable impacts will be smaller and should, by default, have a lower cumulative cost.
 

carlhenry

Distinguished
Aug 18, 2009
you guys missed the entire point of this article.

"It is just like buying a car: even if you have just enough money for a Porsche, tires, gas, insurance, and taxes still have to be paid. And if that makes things a bit tighter, driving is less fun."

it says that if you have just ENOUGH money to buy a particular card -- say, like me, i opt for cards in the 6850/6870/GTX 460 price range (heck, i even decided on an OCed GTS 450 instead of those cards for my 1680x1050 lcd) -- but i could also save for a couple of months to chip in for a card in the GTX 470/5870 price range, the drawbacks that come with those upper-tier cards could do me more harm than good for a couple of FPS and some eye candy turned on. right now i'm very satisfied with my card and my electricity bill :)

i was planning on upgrading to a 24" and a GTX470, but i guess a 6850/6870 would be more fitting for people like me. thanks for the article tom's! really, great job here! helped me decide what to buy for my next upgrade.

ps. my GTS 450 runs everything on max playing Batman AA, NBA 2k11, CoD: MW2, and Mafia II (with physx hack lol that's why i was planning on upping to a GTX 470 but then again......), so... i am quite happy with it. hoping this card would have no hiccups on NFS and Black OPS for my next purchases hihi...
 

NecessaryEvil4

Distinguished
Jun 9, 2010
Even "The Gamer" profile is probably light on the power bill in comparison to the way I use my computer. When I'm not using my GTX 470 for gaming, it's folding the rest of the time. I'd like to see an update to this article demonstrating the effects 24/7 usage at 99% load (as it is indicated while folding).
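For reference, the 24/7 folding scenario is easy to estimate yourself. A quick sketch, where the card's sustained draw and the electricity rate are assumed figures, not measurements from the article:

```python
# Annual energy sketch for a GPU folding 24/7 at ~99% load.
# The wattage and rate are assumed figures for illustration.

CARD_LOAD_W = 220     # assumed sustained draw of a GTX 470 while folding
RATE = 0.12           # assumed electricity price, $/kWh

annual_kwh = CARD_LOAD_W * 24 * 365 / 1000
annual_cost = annual_kwh * RATE
print(f"{annual_kwh:.0f} kWh/year, about ${annual_cost:.2f}")
```

Running around the clock multiplies the gaming-profile numbers severalfold, which is why a folding update to the article would look very different.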
 
Guest
Geez, I can't afford to switch my 480; guess I'll just have to stay with it for a long while :D. But then again, most games only use 70% of the GPU, so I'll switch my GPU when games use 90-100% of it.
 

3dz1959

Distinguished
Feb 21, 2011
I know this was a power consumption article, but they should have thrown in at least a performance chart to see if a hundred dollars a year is worth it (say, 480 vs. 570). I mean, if a 580 gets better performance at less cost than a 480, it would be worth the move; so now go see the reviews to figure it out.
 

Wolvan

Distinguished
Oct 20, 2009
Get "real time" pricing from your electric company if it's offered. The way electric rates are normally charged, your usage for the day is billed at the highest price seen during the day. With real-time pricing, you're billed at the price for each hour instead of one price for the whole day.

What does this do for you? I'm married and I work a M-F job, so my game computer only runs evenings and weekends. Prices for electricity peak M-F during daytime hours due to industrial/commercial use, and they drop way down in the evenings and on weekends. So the average price I pay for the electricity my computer uses is down around 2-4 cents/kWh instead of the 13 cents/kWh you will see during peak times.
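The peak vs. off-peak gap quoted above translates into real money over a year. A small sketch, where the annual consumption figure is an assumption and the rates are the ones the post quotes:

```python
# Savings sketch for time-of-use pricing, using the rates quoted above.
# The annual consumption of the gaming PC is an assumed figure.

ANNUAL_KWH = 400          # assumed yearly consumption of the gaming PC
PEAK_RATE = 0.13          # 13 cents/kWh, daytime peak
OFF_PEAK_RATE = 0.03      # ~3 cents/kWh, evenings and weekends

peak_cost = ANNUAL_KWH * PEAK_RATE
off_peak_cost = ANNUAL_KWH * OFF_PEAK_RATE
print(f"Peak: ${peak_cost:.2f}, off-peak: ${off_peak_cost:.2f}, "
      f"saved: ${peak_cost - off_peak_cost:.2f}")
```

For an evenings-and-weekends gamer, nearly all of the PC's consumption lands in the cheap hours, so the off-peak figure is close to what you'd actually pay.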

Any way you look at it, this program will save you money if you watch when you're using your energy-consuming items (TV, computer, washer, stove, dryer, etc.), which is usually during evenings and weekends anyway. In Illinois, ComEd offers the program for free.

But as a gamer with a job, I don't really care if it's an extra $90 a year for entertainment. It's still very cheap compared to other forms of entertainment out there.
 

reaversedge

Distinguished
Oct 8, 2010


Actually, that depends on work, girlfriend, and free time. I only have a maximum of two hours of free time, and sometimes I spend it gaming because I need some rest too. Some people can manage, but being a hardcore gamer eats a lot of your time. In my case, compared to years ago when I had no girlfriend, my gaming time is significantly different.
 

cadder

Distinguished
Nov 17, 2008
They are also neglecting the positive side effects, like not needing a space heater in the winter... you recoup a lot of energy right there.

This is a good point- the power used by the computer is turned into heat. At any time that you would be running a heater, the heat coming from the computer reduces the amount of heat that you would need to generate from a heater. If you heat your house with electric heat then there is no additional cost to run the graphics card. If you heat with natural gas or oil, then you can probably generate heat cheaper than electric heating would cost.

OTOH any time you are running an air conditioner, you have to pay for the electricity that your computer converts into heat, and then you have to pay for additional electricity to run your air conditioner and move that heat outside, so to some extent you are paying double.
 

Ahumado

Distinguished
Jan 31, 2003
Gotta tell ya. I don't care about green a whit. I don't care how much power my card uses..I pay for it. Green is a farce.....at work we have green laser toner cartridges. So, if I pour that stuff in your face it isn't going to blind you? I know I'm an idiot jerk. That's OK. Think I will put more wood in the wood stove....mmmmm Warm
 