Graphics Card Power Consumption Tested: Which GPUs Slurp the Most Juice?

Thanks for this.

I have to admit, maybe it's just the cooling limitation, but I did not at all expect the Vega 56 and Vega 64 to land almost exactly on their official TDP numbers of 210W and 295W.


Don't taunt me like this, LOL
LOL. I don't have all of the older generation GPUs, and in fact I'm missing a whole lot of potentially interesting models. But I do have 980 Ti, 980, 970, Fury X, and R9 390 that might be fun to check.

Yeah, what the heck -- I'm going to stuff in the 900 series while I work on some other writing. Thankfully, it's not too hard to just capture the data in the background while I work. Swap card, run Metro for 10 minutes, run FurMark for 10 minutes. Repeat. Check back to see new charts in... maybe by the end of the day. (I think I have a 770 and 780 sitting around as well. And an R9 380 if I'm feeling ambitious.)
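In case anyone wants to replicate the software side of this at home, here's a rough sketch in Python of what a background power logger can look like. Note that it polls driver-reported power via nvidia-smi (basically what GPU-Z sees), not the true board power that Powenetics measures in hardware, and the file name and durations are just placeholders:

```python
# Rough sketch: poll driver-reported GPU power once per second while a
# game or FurMark runs in the foreground. Nvidia-only, software-level
# readings, so only an approximation of hardware-based measurement.
import csv
import subprocess
import time

def log_power(outfile="power_log.csv", duration_s=600, interval_s=1.0):
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "power_w"])
        start = time.time()
        while time.time() - start < duration_s:
            result = subprocess.run(
                ["nvidia-smi", "--query-gpu=power.draw",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True)
            # nvidia-smi returns watts as plain text, e.g. "217.45"
            writer.writerow([round(time.time() - start, 1),
                             result.stdout.strip()])
            time.sleep(interval_s)
```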

On Vega, the one thing that's shocking to me is the gap between GPU-Z and Powenetics. The Vega 64 appears to use about 80W on "other stuff" besides the GPU, if GPU-Z's power numbers are accurately reporting GPU-only power. The Vega 56 isn't nearly as high -- about a 45W difference between GPU-Z and Powenetics. That suggests the 18% increase in HBM2 clocks causes about a 75% increase in HBM2 power use (though some of it is probably VRMs and such as well).

Full figures:
Vega 64 Powenetics: 296.7W avg power during Metro, GPU-Z says 219.2W.
Vega 56 Powenetics: 209.4W avg power during Metro, GPU-Z says 164.6W.
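For anyone following along, here's the back-of-the-envelope math behind that HBM2 estimate, using only the figures above:

```python
# Board power (Powenetics) minus GPU-only power (GPU-Z) approximates
# everything else on the card: HBM2, VRM losses, fans, etc.
vega64_board, vega64_gpu = 296.7, 219.2
vega56_board, vega56_gpu = 209.4, 164.6

other64 = vega64_board - vega64_gpu  # 77.5W of non-GPU power on Vega 64
other56 = vega56_board - vega56_gpu  # 44.8W of non-GPU power on Vega 56

# ~73%, in line with the rough "about 75%" estimate above
print(f"Non-GPU power increase: {100 * (other64 - other56) / other56:.0f}%")
```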
 
Reactions: King_V and bit_user

King_V
LOL. I don't have all of the older generation GPUs, and in fact I'm missing a whole lot of potentially interesting models. But I do have 980 Ti, 980, 970, Fury X, and R9 390 that might be fun to check.

Yeah, what the heck -- I'm going to stuff in the 900 series while I work on some other writing. Thankfully, it's not too hard to just capture the data in the background while I work. Swap card, run Metro for 10 minutes, run FurMark for 10 minutes. Repeat. Check back to see new charts in... maybe by the end of the day. (I think I have a 770 and 780 sitting around as well. And an R9 380 if I'm feeling ambitious.)

On Vega, the one thing that's shocking to me is the gap between GPU-Z and Powenetics. The Vega 64 appears to use about 80W on "other stuff" besides the GPU, if GPU-Z's power numbers are accurately reporting GPU-only power. The Vega 56 isn't nearly as high -- about a 45W difference between GPU-Z and Powenetics. That suggests the 18% increase in HBM2 clocks causes about a 75% increase in HBM2 power use (though some of it is probably VRMs and such as well).

Full figures:
Vega 64 Powenetics: 296.7W avg power during Metro, GPU-Z says 219.2W.
Vega 56 Powenetics: 209.4W avg power during Metro, GPU-Z says 164.6W.

I'd most certainly be curious as to the R9 380 results, but that's because, if I understand it correctly, it's basically a slightly overclocked R9 285. Since I had an R9 285 (Gigabyte's Windforce OC version), well, the curiosity is there, LOL

And yep, with the Vegas, I was talking about the Powenetics numbers. I had assumed they'd blow past their official TDP numbers, but no, they're right in line, give or take 1-2 watts.
 
I'd most certainly be curious as to the R9 380 results, but that's because, if I understand it correctly, it's basically a slightly overclocked R9 285. Since I had an R9 285 (Gigabyte's Windforce OC version), well, the curiosity is there, LOL

And yep, with the Vegas, I was talking about the Powenetics numbers. I had assumed they'd blow past their official TDP numbers, but no, they're right in line, give or take 1-2 watts.
So, my old R9 380 4GB card is dead. Actually, it works, but one of the fans is busted and it crashed multiple times in both Metro Exodus and FurMark, so I'm not going to put any more effort into that one. RIP, 380...

Anyway, I didn't show GPU clockspeeds, which is another dimension of the Vega numbers. The throttling in some of the tests is ... severe. This is why people undervolt, because otherwise the power use is just brutal on Vega. But undervolting can create instability and isn't a panacea.

In Metro, the Vega 56 averages GPU clocks of 1230MHz -- not too bad, considering the official spec is 1156MHz base, 1471MHz boost. Vega 64 averages 1494MHz, again with a base of 1247MHz and boost of 1546MHz. But FurMark... Vega 56 drops to 835MHz average clocks, and Vega 64 is at 1270MHz. That's on 'reference' models. The PowerColor V56 runs higher power and clocks, obviously. The MSI V64 is also better, possibly just because of binning and being made a couple of months after the initial Vega launch.
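To put that throttling in perspective, here's the same data expressed as a percentage of rated boost clock (the numbers are copied from above; the presentation is just one way to slice it):

```python
# Average observed clocks vs. official base/boost, from the figures above.
cards = {
    # name: (metro_avg_mhz, furmark_avg_mhz, base_mhz, boost_mhz)
    "Vega 56": (1230, 835, 1156, 1471),
    "Vega 64": (1494, 1270, 1247, 1546),
}
for name, (metro, furmark, base, boost) in cards.items():
    print(f"{name}: Metro at {100 * metro / boost:.0f}% of boost, "
          f"FurMark at {100 * furmark / boost:.0f}% of boost "
          f"({'below' if furmark < base else 'above'} base clock)")
```

The Vega 56 FurMark result works out to about 57% of boost, well below even the base clock, which is what I mean by severe.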
 
Reactions: King_V
For those paying attention here, I've added results for the GTX 980 Ti, 980, 970, and 780 from Nvidia. The 980 and 980 Ti are reference cards, while the 970 is a Zotac with reference clocks and the 780 is an EVGA model. I've also added the AMD R9 Fury X (reference) and Sapphire R9 390 Nitro. Their real-time power use is shown on a fourth 'legacy' chart.

Separate note: dang, some of these old cards are on their last legs. My R9 380 basically can't make it through either Metro Exodus or FurMark, and the 780 needed some help (by way of a boosted fan speed). If you're still using any of these older-generation GPUs, I feel for you.
 
Reactions: bit_user
There must be hundreds of thousands of RX580s out there but you chose NOT to test them?
Perhaps you have other articles that do that?
You have something against Polaris?
Oh, OBVIOUSLY he has something against Polaris, which is why he tested the RX 570 and RX 590. Clearly. :rolleyes:
The actual story is pretty mundane. I worked for PC Gamer for years before coming to Tom's Hardware, and PC Gamer got most of my old test equipment, including the RX 580. I got different GPUs for Tom's Hardware, but apparently there wasn't an RX 580 in the pile, and the Future offices are still closed. I'm working on getting a 580 to add to the charts, just for the sake of completion, but it will most likely be slower than a 590 and use more power.

And yes, I could "just buy it" if needed. But I'm a family man and I'm not going to just spend $150 for the sake of completing my GPU set. The fact is, RX 580 is basically old news and generally not worth buying. RX 590 shows how a slightly faster and more power efficient Polaris 30 GPU performs, and RX 570 shows a slightly slower GPU with only 4GB VRAM. 580 splits the difference. But if I can get one, I'll test it for the charts.
 

jgraham11
Generally I like this article, nicely done. But I want more. One game and one synthetic benchmark don't really paint a good picture of how these cards actually perform in the real world:

Idle temps were truncated off of the time vs. power charts; it would have been nice to know how they compare, as computers (mine, anyway) sit idle for most of the day.

Does having an extra monitor change anything?

What about games that use DX12 vs. Vulkan vs. DX11?

Also, having temperature and frequency overlaid with power would be great.

How long do cards boost for?

Would be nice to see an at-the-wall measurement; it may be a good way of seeing driver overhead on the CPU.

Do you plan on using some of the power-saving features, like AMD Chill?
 
Generally I like this article, nicely done. But I want more. One game and one synthetic benchmark don't really paint a good picture of how these cards actually perform in the real world:
It took quite a bit of time to get everything set up for testing and then run through the 40-ish cards. If/when a new game comes out that's worth testing, I can see about capturing power data. The main difficulty with getting good power data is that you ideally need a test that lasts long enough for the GPU to warm up and level off, and a lot of benchmarks only last 60-90 seconds. Metro loops, but it still has dips. But yeah, it's a topic I plan to keep an eye on, as Metro Exodus for sure isn't representative of all games.
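For what it's worth, here's a minimal sketch of what 'leveled off' could mean in a power log, assuming once-per-second samples; the window and tolerance values are arbitrary picks for illustration, not what I actually use:

```python
def has_settled(samples_w, window_s=120, tolerance_w=3.0):
    """True if mean power over the last window is within tolerance of the
    window before it, i.e. the GPU has warmed up and leveled off."""
    if len(samples_w) < 2 * window_s:
        return False  # not enough data to judge yet
    recent = sum(samples_w[-window_s:]) / window_s
    earlier = sum(samples_w[-2 * window_s:-window_s]) / window_s
    return abs(recent - earlier) <= tolerance_w
```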
Idle temps were truncated off of the time vs. power charts; it would have been nice to know how they compare, as computers (mine, anyway) sit idle for most of the day.
I assume you mean power, and I clipped off that data for these charts. It's interesting to see how the various GPUs behave, though. Newer models are definitely better at getting down to low-power idle states faster. Everything GTX 10-series and later or RX 400 and later looks decent, but I know the GTX 780 tended to idle at 40-70W for much longer than the 980, 1080, and 2080. Again, I can see about capturing this for future articles, but didn't focus on it here.
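If you wanted to put a number on 'gets down to low-power idle faster', one hypothetical way to do it from such a log, with the 20W cutoff being purely illustrative:

```python
def time_to_idle(samples_w, load_end_idx, idle_threshold_w=20.0):
    """Seconds (at 1 sample/s) from the end of load until power first
    drops below the idle threshold; None if it never does in the log."""
    for i, power in enumerate(samples_w[load_end_idx:]):
        if power < idle_threshold_w:
            return i
    return None
```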
Does having an extra monitor change anything?
What about games that use DX12 vs. Vulkan vs. DX11?
Also, having temperature and frequency overlaid with power would be great.
Probably. See above on games with different APIs (this was Metro using DX12). And I think putting temp, power, and frequency on a single chart would be a mess. I do have the temp, power, fan speed, GPU load, etc. data, though -- I just didn't report it here.
How long do cards boost for?
Would be nice to see an at-the-wall measurement; it may be a good way of seeing driver overhead on the CPU.
Do you plan on using some of the power-saving features, like AMD Chill?
Boost depends on many things, including the card design and fan speed. Most of the GPUs have relatively stable MHz during gaming tests, but some models definitely fluctuate more. FurMark meanwhile puts enough of a strain on the GPUs that quite a few really throttle down after the first 10-60 seconds.

AMD Chill is mostly out of the scope of this testing, but it's something I could look at in the future. The problem is that it's a LOT of time for each additional type of test, and many of the resulting data points aren't particularly important. Putting dozens of hours into testing for an article that most people won't read isn't a good use of the available resources (namely, me and my time).

In other words, additional testing, while desirable for a variety of reasons, often isn't practical. It's sort of like how extensive overclocking, undervolting, etc. testing is beyond the scope of a review. The reality of PCs is that probably 99% of users run stock, so that's where our testing time is best spent.
 

bit_user
For those paying attention here, I've added results for the GTX 980 Ti, 980, 970, and 780 from Nvidia. The 980 and 980 Ti are reference cards, while the 970 is a Zotac with reference clocks and the 780 is an EVGA model. I've also added the AMD R9 Fury X (reference) and Sapphire R9 390 Nitro.
Thanks. I had a good laugh seeing the R9 390 and Fury X top the FurMark power charts. I was somewhat expecting it, but it's still a bit of a shock to see how far out Fury X places.

It would've been the icing on the cake to see them in the efficiency rankings table, as well.
 
Reactions: JarredWaltonGPU

bit_user
One game and one synthetic benchmark don't really paint a good picture of how these cards actually perform in the real world:
It's not supposed to. It's just looking at their power consumption in gaming and torture scenarios. For that purpose, all you really need is a couple of typical and worst-case tests.

Idle temps were truncated off of the time vs. power charts; it would have been nice to know how they compare, as computers (mine, anyway) sit idle for most of the day.
You mean idle power? Yeah, that'd be nice to see.
 

bit_user
Without a scatter plot of performance vs power, the entire article is useless.
Certainly not, if you want to know what PSU spec you need for a given GPU. In that case, power utilization is useful by itself.

A uni-dimensional performance-per-watt score is useless.
It serves the purpose of tracking how GPUs' efficiency is evolving over time. That's interesting, even if it's not of great practical relevance for those in the midst of making purchasing decisions.

In discussion threads about GPUs, perf/W is a topic that comes up quite regularly. So, I definitely think this singular metric is worthy of measuring & tracking.
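The metric itself is trivial to compute; something like this, with made-up numbers standing in for real measurements:

```python
# Performance per watt: average fps divided by average board power over
# the same run. The GPU names and figures here are placeholders.
def perf_per_watt(avg_fps, avg_power_w):
    return avg_fps / avg_power_w

gpus = {"GPU A": (90.0, 300.0), "GPU B": (60.0, 150.0)}  # (fps, watts)
for name, (fps, watts) in sorted(
        gpus.items(), key=lambda kv: perf_per_watt(*kv[1]), reverse=True):
    print(f"{name}: {perf_per_watt(fps, watts):.2f} fps/W")
```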
 
The actual story is pretty mundane. I worked for PC Gamer for years before coming to Tom's Hardware, and PC Gamer got most of my old test equipment, including the RX 580. I got different GPUs for Tom's Hardware, but apparently there wasn't an RX 580 in the pile, and the Future offices are still closed. I'm working on getting a 580 to add to the charts, just for the sake of completion, but it will most likely be slower than a 590 and use more power.

And yes, I could "just buy it" if needed. But I'm a family man and I'm not going to just spend $150 for the sake of completing my GPU set. The fact is, RX 580 is basically old news and generally not worth buying. RX 590 shows how a slightly faster and more power efficient Polaris 30 GPU performs, and RX 570 shows a slightly slower GPU with only 4GB VRAM. 580 splits the difference. But if I can get one, I'll test it for the charts.
Thank you. Appreciate the testing.
 

Wendigo
There must be hundreds of thousands of RX580s out there but you chose NOT to test them?
Perhaps you have other articles that do that?
You have something against Polaris?
I was also surprised not to see the RX 580 tested. According to the Steam hardware survey, it's the most common AMD card right now and the only AMD product making it into the top 10... If there was one AMD card to be tested, it was this one.

EDIT: I saw JW's explanation of why the 580 is absent. Understandable, but still unfortunate.
 