Graphics Card Power Consumption Tested: Which GPUs Slurp the Most Juice?

It would've been the icing on the cake to see them in the efficiency rankings table, as well.
I didn't bother putting them into the table, but I may change that. Here are the numbers:
R9 390: 31.8%
R9 Fury X: 39.1%
GTX 970: 51.1%
GTX 980: 52.2%
GTX 980 Ti: 51.3%

Hmmm... our forums may have better table support than our CMS. Not that it's hard to do. Seriously, all the editors know our HTML tables are awful and we fight with them regularly, but with a single CMS for most of the Future-owned sites, getting changes is hard.
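For anyone curious how numbers like those are typically derived, here's a minimal sketch, assuming (and this is an assumption, not a statement of the article's exact method) that the percentages are performance per watt normalized to the most efficient card in the chart. The fps and watt values below are placeholders, not measured data:

```python
# Hedged sketch: one way relative-efficiency percentages like the ones above
# could be computed, assuming they are performance per watt normalized to the
# most efficient card. The fps and watt values are placeholders, not the
# article's measurements.

cards = {
    # name: (average fps across the test suite, average board power in watts)
    "GTX 980":   (60.0, 165.0),   # hypothetical numbers
    "GTX 970":   (52.0, 150.0),
    "R9 Fury X": (68.0, 260.0),
    "R9 390":    (55.0, 270.0),
}

perf_per_watt = {name: fps / watts for name, (fps, watts) in cards.items()}
best = max(perf_per_watt.values())

for name, ppw in sorted(perf_per_watt.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {100 * ppw / best:5.1f}% of the most efficient card")
```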
 

King_V

Thanks. I had a good laugh seeing the R9 390 and Fury X top the FurMark power charts. I was somewhat expecting it, but it's still a bit of a shock to see how far out Fury X places.

It would've been the icing on the cake to see them in the efficiency rankings table, as well.
We can't have icing at THOSE temps! :LOL:

Sorry, I'll show myself out now...
 

cfbcfb

"The RX 570 4GB (an MSI Gaming X model) actually exceeds the official power spec for an 8-pin PEG connector with FurMark, pulling nearly 180W. "

The 570 can draw an additional 75W from the PCIe slot, for a total of 225W. My 4GB RX 570 is plugged into a crap proprietary power supply via an 8-pin connector; IIRC it's a single 14A rail. It runs over 150W stable for hours, OC'd to 1400 and IIRC the memory at 2000. Folding@home puts a beatdown on the CPU and GPU. I've actually drawn more power and made more heat with Folding@home than FurMark or other CPU burn-in tools.
 
"The RX 570 4GB (an MSI Gaming X model) actually exceeds the official power spec for an 8-pin PEG connector with FurMark, pulling nearly 180W. "

The 570 can draw an additional 75W from the PCIE slot, for a total of 225W. My 4GB RX570 is plugged into a crap proprietary power supply via 8 pin connector, IIRC its a single 14A rail. Runs over 150W stable for hours OC'd to 1400 and IIRC the memory at 2000. folding@home puts a beatdown on the cpu and gpu. I've actually drawn more power and made more heat with folding@home than furmark or other cpu burnin tools.
I think you're missing my meaning. The quote says exactly what I meant: the MSI RX 570 Gaming X 4G can draw up to 180W over the 8-pin PEG connector. (Peak was 182.8W.) The Powenetics hardware collects data for the PCIe x16 slot, as well as up to three 8-pin PEG connectors. The MSI card uses only one 8-pin connector.

While total card power use is right at the 225W limit that the 8-pin plus PCIe x16 slot can provide, the card is well under 75W on the PCIe slot, but is going 30W over spec on the 8-pin. To be precise, the PCIe slot averages 45.5W of power delivery while the 8-pin PEG averages 179.7W.

8-pin connectors are only supposed to provide up to 150W of power, at least if you're staying in spec. Granted, it's far safer to exceed power on the PEG connector than on the PCIe slot (because the latter can impact the motherboard), and in games the card seems to happily target 150W TDP. Also, many PSUs will have two separate 8-pin connectors on one cable harness, meaning the cables are actually capable of pulling 300W total for two 8-pin connectors (in theory). That said, I've seen PEG connectors melt when they were pulling that much power for a long time.

Anyway, it was odd to see just how far beyond 150W certain workloads could go -- and yes, I'm sure some GPGPU tasks like F@H and cryptomining and such can come close to FurMark, which is why I don't really consider it a "power virus" -- it's just a strenuous synthetic workload.
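For anyone following along at home, here's a rough sketch of the kind of per-rail sanity check described above, assuming you already have averaged power readings for each input. The spec limits (75W for the x16 slot, 150W for an 8-pin PEG connector) are from the PCIe/PEG specifications; the sample readings mirror the averages quoted in this post:

```python
# Rough sketch of the per-rail check described above: 75W limit for the PCIe
# x16 slot, 150W for an 8-pin PEG connector. Sample readings mirror the
# averages quoted in this post for the MSI RX 570 running FurMark.

SPEC_LIMITS_W = {"pcie_slot": 75.0, "peg_8pin": 150.0}

readings_w = {"pcie_slot": 45.5, "peg_8pin": 179.7}

total = sum(readings_w.values())
print(f"Total board power: {total:.1f}W")

for rail, watts in readings_w.items():
    limit = SPEC_LIMITS_W[rail]
    delta = watts - limit
    status = f"{delta:+.1f}W over spec" if delta > 0 else "within spec"
    print(f"{rail}: {watts:.1f}W (limit {limit:.0f}W) -> {status}")
```

Run as-is, it reports the slot comfortably within spec and the 8-pin roughly 30W over, matching the numbers above.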
 

zodiacfml

Inconsistent. It looks like you are using a high-end RX 570 with power consumption similar to an RX 590. I know it's not impossible; I have RX 570s here that have consumption close to my Vega 56, while some RX 580s equal the Vega.
 
I can only test with the cards I have, and that meant an MSI RX 570 Gaming X 4G in this case. What's interesting is that the card is right at the expected 150W TDP (TBP) when running Metro Exodus. The problem is that FurMark blows right through all the power and throttling restrictions. I'm not quite sure why, but I suspect any RX 570 or 580 is likely to exceed TBP. Some might not go as high as 225W, but the other AMD cards tested indicate there's something about FurMark-type workloads that will exceed typical power use.
 

bit_user

the other AMD cards tested indicate there's something about FurMark-type workloads that will exceed typical power use.
FurMark is infamous for this. I believe it achieves very high shader occupancy and is compute-bound, rather than being memory-bottlenecked.

That still doesn't fully explain why AMD cards do such a poor job at backing off to stay within their power limits.
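To illustrate the "compute-bound, high occupancy" idea, here's a rough analogy (not FurMark's actual code): back-to-back large matrix multiplies keep the shader cores saturated with very little memory traffic per FLOP, so board power heads toward the limit. This assumes PyTorch and a CUDA-capable GPU:

```python
# Rough analogy for a FurMark-style compute-bound load (not FurMark itself):
# repeated large matrix multiplies keep the shaders busy and push power
# toward the board limit.
import torch

assert torch.cuda.is_available(), "needs a CUDA-capable GPU"

a = torch.randn(4096, 4096, device="cuda")
b = torch.randn(4096, 4096, device="cuda")

for _ in range(2000):          # a few seconds of sustained load
    c = a @ b                  # results are discarded; only the load matters
torch.cuda.synchronize()       # wait for all queued work to finish
print("done stressing the GPU")
```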


 
To my understanding, this only tells how much power a GPU will consume when you play a game at certain settings. That's a different job for every GPU, since they produce a different amount of detail per second (pixels, voxels, etc.) based on varying fps.

I think the more interesting data set would be how many watt-hours are needed to finish a certain task, like encoding a video or rendering a 3D object.

Since my eGPU will not do heavy work 99% of the time and will spend a lot of time idle, I'm most interested in GPU power efficiency at idle and in light use.

I also don't want extra heat in the room and want to save on electricity.

How much do these GPUs use when you do nothing, like showing the desktop on a 4K monitor or just scrolling a few pages in a browser?
 
This is true, though in practice the difference between most games isn't very large unless a game is specifically limited by something other than the GPU at some settings. I used Metro Exodus, and I noticed that some GPUs (especially lower end models) use more power at 1080p medium than at 1440p ultra -- and while I didn't explicitly state this, I did test at the settings that resulted in higher power use. I also checked a few cards in several different games, and the difference for an RTX 2080 Ti as an example at 1440p medium to 1440p ultra was mostly about 5-10W.

As for idle power, nearly all of the recent (Turing and later, Navi and later) GPUs idle at ~15W, give or take. AMD GPUs do have a "super low power" mode that kicks in when your display is in sleep mode, where the GPU drops to maybe 8W, but for actually using the PC you'll usually be in the 15-25W range.
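On the earlier watt-hours question: energy for a fixed task is just average power times time, so a faster card can draw more watts yet still use less total energy. A tiny worked example with made-up numbers, purely to show the arithmetic:

```python
# Energy to finish a fixed task = average power (W) x time (h).
# The cards and numbers below are made up purely to illustrate the math.

tasks = {
    # name: (average power in watts, time to finish the same render in minutes)
    "fast, hungry card": (220.0, 25.0),
    "slow, frugal card": (130.0, 55.0),
}

for name, (watts, minutes) in tasks.items():
    watt_hours = watts * minutes / 60.0
    print(f"{name}: {watt_hours:.0f} Wh for the same job")

# The 220W card finishes in 25 min (~92 Wh), while the 130W card takes
# 55 min (~119 Wh) -- lower power draw, but more total energy.
```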
 
I'm looking for a card for my eGPU with the smallest idle power use, since 99% of the time I'd be using it only to produce more screen real estate.
The other 1% might be gaming or encoding & rendering.
So I'd like it to use as little energy as possible and also produce as little heat as possible.

How big a difference is there in the current AMD lineup?
 
Idle power for most of the latest cards is around 15W, give or take. A few might go as low as 9W (when the display is in sleep mode), some might get close to 20W, but a few watts isn't a major concern, I don't think. Once running a graphics workload, you'll get performance and power as shown in these charts. You can reduce power use by reducing performance, basically. Also, for an external GPU box, you probably won't get maximum performance anyway and I wouldn't go above the ~$500 GPU mark. The PCIe x4 link equivalent is going to be a bottleneck.
 
Thanks for the info.
How much do you think the power usage will be when the GPU is driving 2x 4K screens with mostly static graphics (browsers, email, PDFs, office, calendar, etc.)?
Will there be any difference between the RX 5500 and other cheaper/older cards? Does 7nm Navi drop power usage noticeably in light use?
 
I guess my choice is now between 5500, 570 and 580.
Which gives the best balance of price, performance, and power usage...?

I have one old Eizo that would benefit from having DVI, but I could also use that with some other setup...
 
In terms of performance per watt, the 5500 XT wins easily: the 5500 XT 8GB is basically tied with the RX 590 in performance (it's about 2% slower is all) and uses 126W vs. 214W. The RX 580 8GB is about 4% slower than 5500 XT (more if it's a card that's not as overclocked as the Sapphire Nitro+ I used for testing) and has power use of around 208W (less with some 580 models).

Factoring in price, the RX 590 is now selling for $210+ ($195 on eBay), RX 580 8GB is selling for $170 (possibly $130 on eBay), and RX 5500 XT 8GB is selling for $190 (same or higher on eBay). Given pricing is currently pretty close, RX 5500 XT 8GB wins out in my book, but if you can find a cheap RX 580 (meaning, around $140 or less) that would still be a reasonable option.
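Plugging the relative-performance and power figures from this post into a quick performance-per-watt calculation (RX 590 normalized to 100) shows why the 5500 XT wins; treat the exact ratios as approximate:

```python
# Performance per watt using the figures quoted above: RX 590 = 100,
# 5500 XT 8GB ~2% slower, RX 580 8GB ~4% slower than the 5500 XT.
# Ratios are approximate; power figures are the averages from the article.

cards = {
    "RX 590":         (100.0, 214.0),
    "RX 5500 XT 8GB": (98.0,  126.0),
    "RX 580 8GB":     (98.0 * 0.96, 208.0),
}

baseline = cards["RX 590"][0] / cards["RX 590"][1]

for name, (perf, watts) in cards.items():
    ppw = perf / watts
    print(f"{name:15s} perf/W = {ppw:.3f} ({ppw / baseline:.2f}x the RX 590)")
```

Roughly speaking, the 5500 XT 8GB lands at about 1.66x the RX 590's performance per watt, while the RX 580 8GB is about on par with the 590.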
 
Thanks for the advice!
The only factor in favor of the older cards is that dual-link DVI Eizo.
I'm not sure if I could use the Apple DP -> dual-link DVI adapter that I have between the DP switch (which I'm planning to buy) and the Eizo.
I understand that DVI connectors have nothing to do with this thread, but I have (Tim Cook's) courage to ask:
Does a graphics card with a 5500 and a DVI connector exist?
Or were they wiped out by the last generation? Is this a technical decision by AMD?
 
I think all the necessary hardware is present on current GPUs to support DVI, but it's not usually done because it's viewed as a dying / dead interface. Or maybe only single-link is supported? Anyway, this PowerColor 5500 XT 8GB has a DVI-D connector, but the specs say it's single-link. There are also multiple GTX 1650/1660 Super cards that have DVI-D connectors, but none explicitly say they're dual-link. Of these, I'd personally go for the Asus 1660 or 1660 Super, unless you insist on using an AMD GPU. But DL DVI-D support seems unlikely on Navi while being almost certain with Turing:

$200 PowerColor 5500 XT 8GB: https://www.newegg.com/powercolor-radeon-rx-5500-xt-axrx-5500xt-8gbd6-dh-oc/p/N82E16814131762
$200 Asus GTX 1660: https://www.newegg.com/asus-geforce-gtx-1660-tuf-gtx1660-o6g-gaming/p/N82E16814126305
$230 Asus GTX 1660 Super: https://www.newegg.com/asus-geforce-gtx-1660-super-tuf-gtx1660s-o6g-gaming/p/N82E16814126359
$170 Gigabyte GTX 1650 Super: https://www.newegg.com/gigabyte-geforce-gtx-1650-super-gv-n165swf2oc-4gd/p/N82E16814932231

What Eizo display are you using? Might be time to sell it and just upgrade to a modern DisplayPort monitor, though I know you're unlikely to get a good value out of going that route. Or you can try one of those 1650 Super or 1660 Super cards, or even the PowerColor RX 5500 and see if the DVI link is actually dual-link.

PowerColor explicitly says "SL DVI-D" but the connector looks like a standard DL DVI-D. (See: https://upload.wikimedia.org/wikipedia/commons/f/fb/DVI_Connector_Types.svg ) I'm 95-ish percent sure the GTX 16-series parts with DVI-D are dual-link as well -- there's at least one place (PCMag) that says the standard configuration for the cards is 1x DL DVI-D, 1x HDMI, and 1x DP.
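For what it's worth, the single-link vs. dual-link question comes down to pixel clock: single-link DVI tops out at a 165 MHz pixel clock, which is why 2560x1600 panels need dual-link. A back-of-the-envelope check (the ~20% blanking overhead is a rough estimate, not an exact CVT/CVT-RB timing):

```python
# Back-of-the-envelope DVI link check. Single-link DVI-D tops out at a
# 165 MHz pixel clock; dual-link doubles the TMDS pairs. The blanking
# overhead here is a rough estimate, not an exact CVT/CVT-RB timing.

SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.20  # ~20% extra pixels for blanking intervals (approx.)

def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for width, height, refresh in [(1920, 1080, 60), (2560, 1600, 60)]:
    clock = pixel_clock_mhz(width, height, refresh)
    kind = "dual-link required" if clock > SINGLE_LINK_MHZ else "fits on single-link"
    print(f"{width}x{height}@{refresh}: ~{clock:.0f} MHz -> {kind}")
```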
 
Found this thread:
https://www.techpowerup.com/forums/...e-rx580-power-consumption-30w-at-idle.263656/

Sapphire's 580 idling at 30 watts, others at 10W...
This is the difference I'm looking for.

Any charts anywhere for idle power use...?
My spreadsheets show ~14W 'idle' power use (with the monitor still powered up) on the Sapphire RX 580 8GB Nitro+ LE I used for my testing. That's a higher performance 580, so I suspect other 580 cards will potentially use a bit less power (1-2W). But that's only with a single display. Let me see if I can get some quick figures for what happens when two 4K monitors are connected...
 
I just ran some tests, and the results are very interesting -- and again illustrate the problem with 'trusting' whatever power numbers software tells you. This is an image showing the Sapphire RX 580 Nitro+ LE at idle with a single 4K display connected, then with a second 4K display, then launching Horizon Zero Dawn and the built-in benchmark. Then I exit to the desktop, disconnect the second display, and launch HZD again and run the test a second time.
First, notice that true idle power is around 14W (14.5W average over the first 75 seconds), but GPU-Z reports idle power of 34W -- that's a 20W delta because the software is getting an incorrect value from the hardware. When I connect the second monitor, however, the power readings are nearly identical (random luck?): Powenetics says 42.6W and GPU-Z says 42.3W average.

Launching Horizon Zero Dawn causes the GPU power to jump a bit, but it's still not doing a full workload -- it's just doing more than running the Windows desktop. Once the benchmark starts, power jumps quite a bit once more.

For the ~three minute benchmark sequence, real (Powenetics) power use for the Nitro+ 580 averages 167W, but GPU-Z reports an average power use of 140W. So now it's a 27W delta the other way. Switching to a single display doesn't really drop gaming power use though: 166.6W for Powenetics, and 140.5W for GPU-Z.
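As a minimal sketch of how those windowed averages and the hardware-vs-software delta get computed, assuming you already have two aligned streams of (timestamp, watts) samples (the made-up samples below roughly reproduce the 14.5W vs. 34W idle readings quoted above):

```python
# Sketch: compare average power from two sample streams over the same time
# window (e.g. hardware Powenetics readings vs. software GPU-Z readings).
# The sample data below is made up; real logs would be loaded from files.

hw_samples = [(t, 14.0 + 0.5 * (t % 3)) for t in range(0, 75)]   # watts
sw_samples = [(t, 34.0 + 0.5 * (t % 3)) for t in range(0, 75)]   # watts

def window_average(samples, t_start, t_end):
    vals = [w for t, w in samples if t_start <= t < t_end]
    return sum(vals) / len(vals)

hw_idle = window_average(hw_samples, 0, 75)
sw_idle = window_average(sw_samples, 0, 75)
print(f"hardware: {hw_idle:.1f}W, software: {sw_idle:.1f}W, "
      f"delta: {sw_idle - hw_idle:+.1f}W")
```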
 
What Eizo display are you using? Might be time to sell it and just upgrade to a modern DisplayPort monitor, though I know you're unlikely to get a good value out of going that route. Or you can try one of those 1650 Super or 1660 Super cards, or even the PowerColor RX 5500 and see if the DVI link is actually dual-link.

PowerColor explicitly says "SL DVI-D" but the connector looks like a standard DL DVI-D. (See: https://upload.wikimedia.org/wikipedia/commons/f/fb/DVI_Connector_Types.svg ) I'm 95-ish percent sure the GTX 16-series parts with DVI-D are dual-link as well -- there's at least one place (PCMag) that says the standard configuration for the cards is 1x DL DVI-D, 1x HDMI, and 1x DP.
My Eizo is an SX3031 and I don't want to sell it. The price wouldn't be good, and there aren't Eizos at that price-quality point today, used or new.
I like it a lot; it represents a quality that seems to be gone from 99% of the market. And it would be sad to dedicate it to an old 2012 Mac mini, just to check once a month that backups are happening.
If PowerColor's specs say SL, I believe them. It would also be nice to have 2x DP for future use... I always plan a 10-year lifespan for every piece of hardware I buy.
 

ForbiddenEra

Just a heads-up to the writer: soldering should be easy. I worry that if you struggled, your joints may be bad.

A bad solder joint can increase resistance through a connection; this could invalidate your test results (like putting a resistor in the wires).

I assume from the article that you were soldering wires. I'm sure there are plenty of videos online, but simply twist the wires together, apply heat to the joint, then feed the solder into the joint. You need to heat the joint (not the solder) so that the solder can flow onto the joint. There is flux inside the solder (hence flux core); that flux is needed to strip the oxide layer off the wire so that the solder will actually attach/wick into the wire. Worst case, you can apply some extra flux to the connection yourself first, but once you get the technique down, you should be able to feed in the flux from the flux-core solder (and feed solder into your joint at the same time). Also, if adding extra flux, it's always a good idea to clean it off (iso), as some fluxes can cause corrosion on the joint if left on (it is, after all, flux's job to eat the protective oxide layer).