News AMD RX 7000-series' high idle power draw finally appears to be fixed with the latest 23.12.1 drivers

So, guess what! I was wrong: AMD's "power problem" isn't actually fixed. I probably had some wrong refresh rates set when doing the testing.

TL;DR: 4K 60Hz has much lower power draw. Depending on the GPU, some cards can run at 4K 144Hz without spiking the power draw more than ~35%. Retesting is ongoing, but it's messy, and now I'm wondering whether using a different DisplayPort connection might make a difference on some cards.

Full details: I'm doing some testing of a recent game launch, and I noticed the RX 7800 XT was back to around 30W of idle power draw. I thought it was because of all the GPU swaps, but it's not; even a full driver cleaning didn't make a difference. What did make a difference: setting the refresh rate lower. And this is where it gets fun.

RX 7800 XT idle power use drops significantly with a 120Hz or lower refresh rate (again, at 4K). The RX 7700 XT behaved similarly: 120Hz was around 13W, while 144Hz was 28W. The RX 7600, meanwhile, simply didn't like anything above 4K 60Hz: it drew 6W at 60Hz but jumped to 15~18W at anything higher (from 82Hz up to 144Hz).

It's not just RX 7000-series GPUs, either. I haven't tested them all, and in fact only have results for the RX 6800 XT at the moment, but at anything above 4K 60Hz (up to and including 144Hz) it draws about 40W at idle. At 60Hz, though, it drops to 8~10W.

@jeffy9987 And this would potentially explain what you're seeing. I haven't checked power draw on a higher refresh rate display (I have a 4K 240Hz display on a different PC, except it uses DSC to get there, and that may or may not be a factor; lots of variables, in other words), but if you have a 7900-class card, it's entirely possible that the cutoff is at 144Hz, so your 165Hz refresh rate is "too high" for the lowered power draw.

Which honestly still feels like there's some driver BS that needs to be fixed. I would think all of the current RX 7000-series GPUs should have the same video output hardware, so if the 7900-class cards can have lower idle power use at 144Hz, but the Navi 32 cards need 120Hz or lower, and Navi 33 needs 60Hz... that's weird. Is the GPU compute somehow factoring into how much power is needed? Because the lower spec cards seem to need to kick into a higher power state earlier than the high-spec cards. 🤷‍♂️

Anyway, I'm adding an update to the text and am in the process of testing other workloads (4K video playback, both via VLC and via YouTube). There will be a future article at some point. This is my TED Talk mea culpa.
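For anyone who wants to poke at this on their own hardware, here's roughly what the logging amounts to. This is just a minimal sketch assuming a Linux box with the amdgpu driver, which exposes board power through the hwmon power1_average / power1_input sensors in microwatts; my actual testing is done on Windows with different tooling, so treat it as illustrative.

```python
# Minimal idle-power logger for Linux + amdgpu: read the hwmon power sensor
# (reported in microwatts) once a second and average it over a minute.
import glob
import time

def find_power_sensor(card="card0"):
    # amdgpu exposes power1_average (power1_input on newer kernels) under hwmon
    for name in ("power1_average", "power1_input"):
        hits = glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*/{name}")
        if hits:
            return hits[0]
    raise FileNotFoundError("No amdgpu power sensor found")

def average_idle_power(seconds=60, interval=1.0):
    path = find_power_sensor()
    readings = []
    for _ in range(int(seconds / interval)):
        with open(path) as f:
            readings.append(int(f.read()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"Average idle board power: {average_idle_power():.1f} W")
```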
Multi-monitor always uses more power. I could test it, though I suspect idle GPU power for people running two displays (which each draw tens of watts themselves) probably isn't as big of a factor.
As I said in a previous post, Nvidia is also not immune to these idle power woes. With any one of my three monitors connected, at any refresh rate, my 3080 seems to have a baseline idle of 34 watts. I highly suspect that some cards, particularly ones from AIBs, have different power management schemes in their BIOSes, further complicating the issue. For instance, my EVGA FTW3 Ultra 3080 has a BIOS switch. On the OC position I flashed the 500W BIOS for higher overclocking, and even at a 100% power budget it idles higher than the stock "Quiet" BIOS at a 100% power budget. I think it also matters whether the card's fans are always spinning or stay off until 50C or whatever the cutoff is, because total board power includes the watts used by the fans. Do you have 1, 2, 3, 4, or 5 fans on your GPU? If they're spinning, that means a higher TBP.
 
Nvidia most definitely isn't immune, but I suspect multimonitor on all GPUs will end up with more than double the idle power draw — on AMD, Intel, and Nvidia, in other words. I'll have to poke at this more to get hard numbers. Activating a second video output will almost certainly use more power than the fans, though.

Fans at most (meaning, 100% fan speed) tend to consume about 5~8 watts each, depending on the fan. Many are on the lower end of that range (i.e. ~0.4 amps at 12V max), but at 20% RPM even the higher spec fans would be around 1~1.5 watts total. Three fans spinning might thus add 3W total at low RPMs, and a maximum of perhaps 20~25 watts. VRMs and other power regulation measures are almost certainly using way more power than the fans.
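Rough math on that, with assumed per-fan current draw (0.4~0.7A at 12V is typical for GPU axial fans, but it varies by model):

```python
# Back-of-the-envelope fan power, using assumed per-fan current draw
# (roughly 0.4~0.7A at 12V; check the label on your specific fans).
VOLTS = 12.0
AMPS_LOW, AMPS_HIGH = 0.4, 0.7  # per-fan current at 100% speed

per_fan_low, per_fan_high = VOLTS * AMPS_LOW, VOLTS * AMPS_HIGH
print(f"One fan at 100% speed: {per_fan_low:.1f} to {per_fan_high:.1f} W")
print(f"Three fans at 100% speed: {3 * per_fan_low:.1f} to {3 * per_fan_high:.1f} W")
# One fan works out to roughly 4.8 to 8.4 W flat out, three fans roughly 14 to 25 W.
# At the low duty cycles you see at idle, the draw is only a watt or so per fan,
# so fans can't explain a 20~30 W difference in idle board power.
```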

My bet is most AMD GPUs are going to jump to roughly 1/3 of their TBP rating with two monitors connected. I don't have a good way to put two high refresh rate 4K displays next to each other, and the PCs they're attached to are usually busy running tests. I don't want to take one offline just to test multi-monitor power use, in other words, at least not right now. But based on what I've already seen, refresh rate and resolution will be a factor in dual-monitor situations as well.
 
You guessed the 7900 GRE to be at some performance level below or up to exactly the 7800 XT's performance. Similar means identical, interchangeable, or the same, not some amount above or below the reference.
Oh, so we're debating semantics, now?

I said "similar, ... possibly even a bit worse". I'm precise with my language. If I meant "up to, but no better than", that's what I would've said.

And no, "similar" is not totally synonymous with "the same as". That's why they're different words.

From Wiktionary:

1. similar (comparative more similar, superlative most similar)
  1. Having traits or characteristics in common; alike, comparable.
My new car is similar to my old one, except it has a bit more space in the back.

As for being "possibly a bit worse", that was borne out in NotebookCheck's 4K testing, where half a dozen or so games had the 7800 XT surpassing the 7900 GRE. Not the majority of games, but enough that we can clearly see the effect of the GRE's lower memory bandwidth.
 
the Navi 32 cards need 120Hz or lower, and Navi 33 needs 60Hz... that's weird. Is the GPU compute somehow factoring into how much power is needed? Because the lower spec cards seem to need to kick into a higher power state earlier than the high-spec cards. 🤷‍♂️
Another thing that differs between these cards is the memory bandwidth. It'd be really interesting if you could correlate these shifts in power consumption to corresponding changes in GPU core and/or memory clocks.

I'm pretty sure that's what's happening - as soon as utilization of some resource crosses a certain % threshold, the GPU decides to kick into a higher gear (i.e. clock speed).
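If anyone wants to check that correlation themselves, here's the rough idea as a sketch. It assumes Linux with the amdgpu driver, which exposes the active core/memory clock states via pp_dpm_sclk / pp_dpm_mclk and board power via hwmon; on Windows you'd pull the same numbers from a monitoring tool instead.

```python
# Illustrative only: poll amdgpu's sysfs interfaces and print board power next to
# the currently selected core (sclk) and memory (mclk) DPM states, so you can see
# whether changing the refresh rate bumps the memory clock into a higher state.
import glob
import time

CARD = "/sys/class/drm/card0/device"  # adjust if your GPU isn't card0

def read_power_watts():
    # power1_average / power1_input are reported in microwatts
    for name in ("power1_average", "power1_input"):
        hits = glob.glob(f"{CARD}/hwmon/hwmon*/{name}")
        if hits:
            with open(hits[0]) as f:
                return int(f.read()) / 1_000_000
    return float("nan")

def active_state(path):
    # pp_dpm_sclk / pp_dpm_mclk list the available states; '*' marks the active one
    with open(path) as f:
        for line in f:
            if "*" in line:
                return line.strip()
    return "unknown"

while True:  # Ctrl+C to stop
    power = read_power_watts()
    sclk = active_state(f"{CARD}/pp_dpm_sclk")
    mclk = active_state(f"{CARD}/pp_dpm_mclk")
    print(f"{power:5.1f} W | core: {sclk} | mem: {mclk}")
    time.sleep(1)
```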
 
Calling it semantics is a cop out. If we don't understand each other because we're using different understandings of the same word, then how can we communicate effectively? In the example, it has to specify that "My new car is similar to my old one," but then goes on to list their differences. If two things are similar, but you then have to list the things that make them less than the same, then the way you used the word "similar" was at least ambiguous. Either way, we are on the same page now. I wish we could speak with something more exacting, like, say, math...
 
I'm pretty sure that's what's happening - as soon as utilization of some resource crosses a certain % threshold, the GPU decides to kick into a higher gear (i.e. clock speed).
I believe it is exactly this! It takes a certain amount of processing to maintain a certain number of pixels at a specific refresh rate. When the load increases or decreases past the incremental steps allotted to the GPU, say 15 MHz jumps, it notches the clock up or down to correspond to the increase or decrease in load placed upon the core. Since it takes more power to maintain a higher clock, a jump in clock is a corresponding jump in power. The exact specifics of which part of the core handles what with regard to resolution and refresh rate load, and what core frequency that requires and why, are beyond me.
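To put rough numbers on that, here's a quick back-of-the-envelope calculation of the pixel rate the display engine has to sustain just for scanout. The blanking overhead here is an assumed approximation, not the monitor's real timings:

```python
# Rough pixel rate the display engine has to sustain for scanout at 4K,
# using an assumed ~7% blanking overhead (real monitor timings will differ).
def pixel_rate_mhz(width, height, refresh_hz, blanking_overhead=1.07):
    return width * height * refresh_hz * blanking_overhead / 1e6

for hz in (60, 120, 144):
    print(f"3840x2160 @ {hz}Hz: ~{pixel_rate_mhz(3840, 2160, hz):.0f} MHz pixel clock")
# Works out to roughly 530 MHz at 60Hz versus roughly 1280 MHz at 144Hz, so it's
# plausible the display/memory clocks can't sit in their lowest state at the
# higher refresh rates.
```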
 
Calling it semantics is a cop out.
Not when you're trying to language-lawyer what "similar" means.

And failing, I might add. "Similar" does not mean "the same", no matter how hard you try to argue that it does.

Either way, we are on the same page now.
If you want to put this behind us, then stop accusing me of malfeasance. I said what I said. I think it was justified by what I knew at the time. I explained what I meant by it, which I think was entirely reasonable. If you can't accept that, it's your problem.

Moreover, I would point out that I'm the one who went out and searched for good benchmarks, in order to put our claims to the test (granted, Stryker got there before I finished updating my post, but the NotebookCheck benchmarks I found are much more comprehensive).

Data always beats speculation! I try to make it a rule never to spend more time speculating about something than it would take to look up or measure it.
 
I am not accusing you of anything other than that you speculated one thing, and I another. Our speculations did not align, and we were both wrong in the end. I thought you meant something you did not; I have no problem. Of course useful benchmarks beat speculation. I thought your initial speculation was more of a lowball regardless of the semantics, is all. Whether or not that is factual is irrelevant, because we were both speculating, and that is what motivated my response at the time.
 
I don't know how you measured this, but the problem is not solved; it's actually worse for the 7900 XT.
I see 40-45W at idle using 4K at 60Hz with the 23.12.1 drivers. With the November drivers the power was actually lower, around 35W.
 
As noted in the update, it's still an issue, but refresh rates and monitor choice do appear to be factors. On my test PC, I consistently get <20W idle on the Navi 31 GPUs at 4K 144Hz. However, any moderate activity on the screen will cause power use to jump up to the 40~50 watts range.
 
Actually, it seems that I made a mistake. The problem was that I had HDR turned on. It seems that HDR and FreeSync have a big impact on power consumption.

Without HDR, the power consumption for the 7900 XT is around 17W, and with HDR it's around 40W, which is pretty good.

I compared it with a friend who has an Nvidia 3080 Ti, and his power consumption is similar: 25-26W without HDR and 41W with HDR.
With two monitors at 144Hz and HDR on, the 3080 Ti idles at 109W on his setup.

So I take back what I mentioned earlier. It was a testing error.
 