What Do High-End Graphics Cards Cost In Terms Of Electricity?


kartu

Distinguished
Mar 3, 2009
Bear in mind that the electricity price in, say, Germany is about 20 euro cents per kWh. That's roughly $0.26, twice as much as in the US.

[citation][nom]NeBuN[/nom]who cares....[/citation]
Nearly everyone who isn't buying the highest-performing card? Which is like 99% of buyers?
 

pelov

Distinguished
Jan 6, 2011
[citation][nom]shin0bi272[/nom]I like how in the interim conclusion for the "normal user" they say to buy a midrange card to save money on power... cause 20 a year for the 580 is gonna break the bank right? You guys really missed the boat when it came to common sense didnt you? If I buy a 600 dollar video card and play lets say 2 hours a day that's not going to cause my power bill to go up as much as the fact that my apartment complex will not replace my livingroom window which has a 1/4" gap between the two panes.[/citation]

Tom's Hardware doesn't review window installations or poor practices by building owners.

The article was written to find out just how much power these things consume and how it affects you, and it did that very well. We've known for a while now that graphics cards have been getting more and more power hungry, but this attempted to answer -- and succeeded in answering -- just how much hungrier they are now. It also shows that ATi/AMD has been a bit more efficient with power draw relative to performance.

I purchased the 5770 knowing that I wouldn't game much and that at 1440x900 (I don't like big monitors) the card was enough for my modest gaming needs. It sips very little power, stays cool, and is enough for my monitor's resolution. Even though it wasn't listed, I imagine it'd be at the bottom of the list had it been there.

If this article doesn't pertain to you or you just can't comprehend why it was written, then don't bitch and moan. Some of us have 3-4 rigs in our house that we've built. Though one thirsty graphics card may not seem like much, when you're dealing with 2-3 in the same household it becomes a different story.
 

Olle P

Distinguished
Apr 7, 2010
I wonder if vertical sync was set to "on" or "off"?
It should make no difference in visual quality, but having it active would cut the power use of the high-end cards quite a bit.
 

Onus

Titan
Moderator
While this article, like ANY article that has ever been published, is not a be-all, end-all on the subject, I almost feel like crowing about it; I've been urging people for years to consider things like power usage and their real needs before arbitrarily wasting their money. IFF (if and only if) it is important to you, this article is a good effort at dealing with a highly subjective can of wrigglies. Sure, it's loaded with disclaimers, but it is an excellent start.
It is tempting to say "Wow, AMD cards sure use less power than nVidia cards," but that would miss the point; an nVidia card is still what you should buy if its combination of features, performance, and power use is right for you. It's like the bogus "CAFE" (Corporate Average Fuel Economy) numbers that automakers use to disguise their [customers'] preference for gas guzzlers; in this case the HD 6970 looks like a bad deal.
 

irtehyar

Distinguished
Nov 8, 2007
We really care about this for a high-end gfx card? Heck I installed a mini-split air conditioner in my room to keep the room cooler while playing pc games because of the heat generated by a high end pc. This is like complaining about paying $600 extra a year for premium fuel for your $60,000 sports car.
 

hardcore_gamer

Distinguished
Mar 27, 2010
[citation][nom]greghome[/nom]Technically, by having a girlfriend, u wouldn't even have the time to be a hardcore gamer anymore........[/citation]

THAT IS NOT TRUE ;)
 

jfby

Distinguished
Jun 4, 2010
I thoroughly enjoy this type of 'real-world' analysis that is theoretically applicable to the everyday consumer, and I appreciate Tom's for doing this kind of report... but I don't think this test is a true representation of what typical people will use.

First, I consider myself a semi-hardcore gamer: I play maybe 5-10 hours of games a week, I have an i7-930 CPU and a Radeon 5850 GPU, and it suits me just fine. I don't touch in a week what you reported on in one day, so my yearly cost, despite being a "hardcore gamer," would be about $5.00 or so. If I ran CrossFire, that would go to $10.00... I spend about that much on diet drinks per week. If I were part of the Folding@home crowd, these numbers might be more applicable, but even then, if I've spent $1-2k on a system, spending $50+ a year on electricity isn't going to bother me too much.

Anyhow, thanks for the diligent work, Tom's, it is greatly appreciated.
 

jfby

Distinguished
Jun 4, 2010
To add to my last comment, I should have said spending an extra $5-10 more than with the reference system, not just $5-10 total...
 

twile

Distinguished
Apr 28, 2006
Whoever wrote this article has clearly never lived in a cold climate. A gaming computer is a godsend for heating purposes.

I'm also completely confused by this idea of gamers buying things that are way more than they need. Are you getting a smooth 60 fps with max settings/AA/etc.? Unless you're only playing WoW, probably not. Personally, I need something that can push 120 fps (or 60x2, for a stereoscopic 3D display), which makes it even more imperative that I have high-performance parts. Maybe my info is just really out of date, but I didn't know that this was easy to get.

Now I'm interested in seeing how these different GPUs compare when running a game that's hit the framerate limit of what the display will show versus running a game that's below that limit. The claim made in this article is that you shouldn't get a card which exceeds your needs because of the absurd power draw, but it has yet to be demonstrated that the cards will drink down their max power consumption when they're not actually needed for max performance. I guess what I'm really interested in is seeing how power scales with GPU load across different cards. I can't believe that when you start up a mid-level game which is capped at 60 FPS, your GTX 580s or HD 5970s are going to hit their power ceiling.
 

trainreks

Distinguished
Jul 23, 2008
[citation][nom]greghome[/nom]Technically, by having a girlfriend, u wouldn't even have the time to be a hardcore gamer anymore........[/citation]

so true @:(
 

vvhocare5

Distinguished
Mar 5, 2008
Man, here in CA, PG&E costs us 39¢/kWh, without over-usage charges. Based on your charts (and 8 hours is wayyyy too much time), that's almost $300 per year for one card... but I run CrossFire...
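The arithmetic behind that figure can be sketched quickly. This is a rough check, not from the article: the ~260 W load draw is an assumed number chosen to match the ~$300/year claim at 8 hours/day and $0.39/kWh.

```python
# Sketch of the annual-cost arithmetic (assumed: ~260 W card under load,
# 8 hours/day of gaming, PG&E's quoted $0.39/kWh rate).
def annual_cost_usd(watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost for a component at a steady draw."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# ~260 W for 8 h/day at $0.39/kWh lands near the ~$300/year figure:
print(round(annual_cost_usd(260, 8, 0.39), 2))  # -> 296.09
```

Halving the hours or the wattage halves the bill, which is why the rate and usage assumptions dominate every number in this thread.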
 

amk09

Distinguished
Mar 29, 2010
This article is unnecessary even though I loved reading it.

I know people always talk about power consumption and all that jargon, and I am definitely guilty of bashing Nvidia for high power usage, but in reality, anyone who can afford any of these cards shouldn't have any worries about paying the power bill.

High-end gaming cards like these are a LUXURY item. If you are worried about not being able to afford the additional expenses that come with them (i.e., a raised power bill), you shouldn't be purchasing them. Like NeBuN said, if you buy a huge SUV, you can afford the $80 it takes to fill up the gas tank regularly.

With my rant aside, I really loved this article toms, and I'm sure a lot of others appreciated it as well.
 

touchdowntexas13

Distinguished
Apr 13, 2009
I was considering getting another gtx470 for sli this summer. But now, maybe not. It's not that the electricity price is all that much, but DANG this card sucks some juice.

Hopefully I can hold off until the next generation of cards. For now I can max everything I play just fine. Heck, maybe I should even consider an underclocked profile in MSI Afterburner.
 

Owenator

Distinguished
Oct 14, 2008
Interesting article. I have found that my GTX 470 SLI rig does heat my room nicely. Part of that heat comes from my four monitors too. The room seems much cooler when they are not just in sleep mode but fully off.

Also, I work at a power plant and my kids need to be fed. So please keep using lots of electricity. Thanks!
 

Spanky Deluxe

Distinguished
Mar 24, 2009
Ouch, my computer at idle draws 560W according to my power meter. Mind you, that's for my computer, three screens, amp, and printer. I also keep it on 24x7 (although with the screens asleep it 'only' draws 330W). At 10p/kWh that's £390 a year if I'm using it 50% of the time and sleeping the screens the rest. Oops. Ah well, it's worth it.
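The £390 figure checks out. A quick sketch of the same sum, using only the numbers the poster gives (560 W in use half the time, 330 W with screens asleep the other half, 24x7, 10p/kWh):

```python
# Checking the poster's £390/year figure: half the year at 560 W
# (in use), half at 330 W (screens asleep), running 24x7, at 10p/kWh.
HOURS_PER_YEAR = 24 * 365  # 8760

avg_watts = 0.5 * 560 + 0.5 * 330        # 445 W average draw
kwh = avg_watts / 1000 * HOURS_PER_YEAR  # ~3898 kWh per year
cost_gbp = kwh * 0.10                    # 10p per kWh
print(round(cost_gbp))                   # -> 390
```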
 
Guest
Even an average electricity consumer can save about $20 a month just by going to CFL bulbs instead of incandescent lighting... You could offset your rig's draw with that and by turning it off or putting it in sleep mode when not in use. Not to mention unplugging all those AC/DC adapters in everyone's homes. Even better, don't set the AC too low or the heat too high. Electricity is cheap, but it still adds up monthly if you're being unwise.
 

saladbarsmash

Distinguished
Feb 16, 2011
If someone is at home playing an average of 8 or even 5 hours a day of video games, they probably aren't paying the electric bill anyway. If it costs $90 a year, that's only $7.50 a month... If someone pays $450+ for a video card setup, then what's an extra $7.50 a month? If the electric bill is that much of a problem, then buy CFL bulbs along with lowering your heat in the winter and AC in the summer to offset the cost. Problem fixed.
 

quovatis

Distinguished
Apr 14, 2008
I don't agree with the conclusions at all. If you are a gamer, you're not deciding between a low-end and a high-end card. You're deciding between two mid- to high-end cards. It looks like the annual cost difference between two similar cards is around $10 maximum. Not much to worry about IMHO.
 

tommysch

Distinguished
Sep 6, 2008
If you live in a cold region (Quebec, Canada), this is irrelevant: pushing that much power through the computer simply offloads the heating system. The same is true for incandescent light bulbs.

This is why I find it hysterical when power consumption is brought up as a reason to choose one brand of graphics card over another. As an enthusiast choosing between, say, a 570 and a 6970, will the 15-cent addition to my monthly power bill (their rates; 12 cents at mine) really be at the heart of my decision? With the cards only $10 apart, buying a 6970 for the power-cost savings would, at 0% interest, take almost 7 years to recoup the $10 "investment".
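The payback arithmetic above can be sketched in a couple of lines. Using the poster's own numbers (a $10 price gap and roughly $0.12/month saved at his local rates), the "almost 7 years" claim holds up:

```python
# Payback-period sketch: months to recoup an upfront price gap from a
# monthly power-bill saving, converted to years (0% interest assumed).
def payback_years(price_gap, monthly_saving):
    """Years needed for monthly savings to equal the upfront price gap."""
    return price_gap / monthly_saving / 12

# $10 gap, ~$0.12/month saved at the poster's 12-cent rates:
print(round(payback_years(10, 0.12), 1))  # -> 6.9, i.e. "almost 7 years"
```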
 
Guest
Dutch consumers should use $0.29/kWh (€0.22 = $0.29)...
$0.13 = €0.096... we can only dream about those prices...
 