Question: 7800X3D iGPU for Windows, 7900 XTX for games?

cy-one

Hello,
This question is not really a new one, but I'm having trouble finding satisfying answers (including "nope, not possible").

The situation is the following:
  • 7900 XTX via DP to monitor (3-DP)
  • 7800X3D via mainboard HDMI to monitor (2-HDMI) (the monitor, if left alone, prioritizes 2-HDMI over 3-DP)
I'd like to set everything up so that, as long as I am not gaming, the computer runs off the 7800X3D's iGPU to save power and heat. The 7900 XTX is a hungry, hot beast, and using it to watch YouTube is just... so inefficient. While I could hook a laptop up to the monitor's 1-HDMI to save power, I'd rather use my actual primary PC for most of my stuff.
I've googled a bit, but haven't really found a working solution.

Currently, with the above cabling, telling Windows to "only show on 2" (7800X3D iGPU) in the display settings does... little. I mean, yes, only the monitor's 2-HDMI input gets a signal, but the PC's power draw (measured at the socket), the 7900 XTX's total board power (Adrenalin) and the 7900 XTX's usage (Task Manager) don't change for the better (as in: less power usage). It seems like the dGPU is still being used "as normal", just that its output isn't shown.

Since I can (and do) connect both the board's HDMI and the card's DP to the monitor, I absolutely don't mind switching inputs to make this work.

How do I do this?
 
Nope, not possible.

Whatever monitor is connected to your graphics card is going to use that graphics card. If you connect a monitor to the iGPU output on your motherboard, then it is going to use THAT.

So, to do what you want, you would almost certainly need one display for basic desktop stuff like YouTube, connected to the iGPU output on your motherboard, and another display (or several) connected to your graphics card, making sure games are displayed on that monitor (or monitors) when gaming.

There is no other way that I know of to force the system to only use the graphics card for gaming and not for anything else if all displays are connected to the graphics card.
 
It's a physical connector; the PC can't reroute the graphics card's signal through the motherboard's outputs. GPUs aren't designed that way either.

Laptops can sort of do it, but only because they're designed to use the CPU's graphics on the desktop and the GPU in games. It probably helps that in laptops the GPU is built onto the motherboard itself.
Desktops aren't designed that way.

I thought they fixed the 7900 XTX multi-monitor power usage bug, so it shouldn't be that much higher than the iGPU now.
 
Thank you for the responses, but I get the feeling both of you missed one detail: the iGPU's signal is not _expected_ to run through the dGPU.
Both GPUs have their own output (iGPU via the motherboard's HDMI, dGPU via the card's DP).
As far as Windows is concerned, I have two identical monitors - one hooked up to the mainboard's HDMI, one to the GPU's DP.
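For reference, both adapters really do show up as separate devices. Here's a quick way to dump what Windows reports; a minimal sketch, assuming the third-party wmi package (pip install wmi):

```python
# List the video adapters Windows knows about, to confirm both the iGPU and
# the dGPU are present and active. Win32_VideoController is a standard WMI class.
import wmi

for gpu in wmi.WMI().Win32_VideoController():
    print(gpu.Name)                  # e.g. the RX 7900 XTX and the Raphael iGPU
    print("  driver:", gpu.DriverVersion)
    print("  status:", gpu.Status)   # "OK" = adapter is up (and drawing power)
```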

When I tell Windows to only use "Monitor 2" (the iGPU/HDMI input of the monitor), the monitor correctly gets no signal via dGPU/DP. However, the dGPU works _(nearly) just as hard_, pulls just as much current and runs just as warm as when running everything on "Monitor 1" (dGPU/DP).

Even when I _unplug_ the DP cable from the dGPU, the dGPU still does its thing in addition to the iGPU.
While writing this post, I've watched YT (with the dGPU's DP both plugged in and unplugged). If the video _isn't running_, the dGPU slowly drops to ~17W total board power in Adrenalin. When I resume playback, the dGPU goes back up to 30-40W TBP.
This value (now 30-50W) is similar to telling Windows to use "Monitor 1" (dGPU/DP) and watching the video there. When pausing the video now, the dGPU drops as well (not to 17W, but to 26W TBP in this case).
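(A timestamped log makes the paused/playing comparison easier to track than eyeballing Adrenalin. A tiny hand-fed sketch; readings are typed in manually, since I'm not aware of a public API for Adrenalin's TBP readout:)

```python
# Hand-fed logger: type "<label> <watts>" after each Adrenalin reading
# (e.g. "paused 17"), Ctrl+C to stop; appends timestamped rows to a CSV.
import csv
import time

with open("tbp_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    print("enter '<label> <watts>' per reading, e.g. 'paused 17' (Ctrl+C to stop)")
    try:
        while True:
            label, watts = input("> ").split()
            writer.writerow([time.strftime("%H:%M:%S"), label, float(watts)])
            f.flush()
    except KeyboardInterrupt:
        pass
```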

What I gather from this is that it matters little whether I run everything off the mainboard's HDMI and tell Windows to only use that monitor, or run everything off the dGPU via DP.
 
I completely understand what you were saying. And unfortunately, as long as the card is installed, it is going to draw power. Period. What you want to do, the way you want to do it, is simply not possible.

Personally, I'd recommend simply running both monitors off the graphics card if you're going to have it installed anyhow, and disabling the iGPU in the BIOS. You'll probably reduce power draw by more than the graphics card uses in non-gaming situations anyhow.
 
Supposedly you can tell Windows which GPU should render which app under Settings -> Display -> Graphics. I've only tested this enough to see the option, but haven't played around with it enough to see if it actually does what it says. Otherwise you could get a DP/HDMI switch. However, the video card isn't going to mostly or completely turn off.
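If you'd rather script those per-app choices than click through Settings, they appear to live under a registry key. A rough sketch, assuming the commonly reported key layout (HKCU\Software\Microsoft\DirectX\UserGpuPreferences); the browser path below is just an example you'd swap for your own app:

```python
# Set the per-app GPU preference that the Settings -> Display -> Graphics UI
# writes. "GpuPreference=1;" = power saving (iGPU), "GpuPreference=2;" = high
# performance (dGPU). Key layout is an assumption based on common reports.
import winreg

KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"
app = r"C:\Program Files\Mozilla Firefox\firefox.exe"   # example path, swap in your app

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY) as k:
    winreg.SetValueEx(k, app, 0, winreg.REG_SZ, "GpuPreference=1;")
```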
 
However, the video card isn't going to mostly or completely turn off.
Which is the whole point. Yes, you can direct Windows to use specific adapters for specific tasks, and customize a lot of other things as well, but as long as the graphics card is plugged into the PCIe slot and has its auxiliary power connected (6-pin, 8-pin, etc., if it requires it), it's going to behave the same. It WILL use power even when it's not displaying anything meaningful, or anything at all. That's clear from the fact that, as the OP stated, you can still see measurable power draw from the device even with the display cable unplugged. So it fixes nothing in terms of what was wanted, but you're correct too.
 
Which is the whole point. [...] It WILL use power even when it's not displaying anything meaningful, or anything at all.
Looking at the OP's numbers as well, it sounds like they're hitting the RX 7000 series issue where, at launch, the cards consumed a stupid amount of power for multi-monitor setups or video playback. AMD allegedly addressed this in a December driver update, but I haven't seen anyone actually confirm that, at least in the 5 minutes I'm willing to spend searching on it.

But the 30-50W the OP is reporting sounds like either the drivers aren't updated or AMD didn't really fix the issue, considering GeForce RTX 40 series cards can sit below 20W with multiple monitors and video playback.
 
While I was aware of the high power draw at launch, I actually didn't even have that in mind when writing my OP.

But yes. Right now I have a few background processes running (Steam, GOG, Razer Synapse, OneDrive and SearchEverything). Additionally, Spotify is streaming, and BambuLabs (a 3D printing slicer) is showing a stream of my printer printing...
I will now pause Spotify and open Adrenalin to check where the wattage sits once I've started a YT video.

[attached screenshot: FNHnON0.png]


Drivers are up to date; the monitor is a 4K screen.
GPU 0 varies between 0% and 5%
GPU 1 varies between 5% and 15%
TBP varies between 45W and 60W
 
Drivers are up to date; the monitor is a 4K screen. [...] GPU 0 varies between 0% and 5%, GPU 1 between 5% and 15%, TBP between 45W and 60W.
With only one monitor plugged in, does it even get down to below ~15W at idle?
 
With only one monitor plugged in, does it even get down to below ~15W at idle?
Not really. I mean... if I plug the monitor into the mainboard's HDMI (iGPU) and tell Windows to use that monitor exclusively (not extending onto both monitors, but using only the one connected to the iGPU), then... kinda. The lowest in that setup was 17W.
If the dGPU is actually used for output, the lowest I got was 26W on just the Windows desktop.

Bit of a correction: I've seen it go down to 18W while idle but with the dGPU driving the output.
However, it mostly stays between 20W and 25W.
 
Not really. I mean... if I plug the monitor into the mainboard's HDMI (iGPU) and tell Windows to use that monitor exclusively, then... kinda. [...]
Also, it slipped my mind, but what monitor are you using?
 
Try seeing if lowering the resolution or using a lower resolution monitor lowers power consumption.

This isn't me trying to push a solution; I'm trying to understand the hardware's behavior, because this may just be how AMD GPUs work. I don't have one to test with, so I can't check myself.
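If you want to script that test instead of clicking through display settings each time, something like this should do it (plain Win32 via ctypes; the 1920x1080 target is just an example, and you switch back by re-running with your native values):

```python
# Switch the primary display's resolution via the Win32 ChangeDisplaySettingsW
# API, so you can watch TBP in Adrenalin at each resolution step.
import ctypes

user32 = ctypes.windll.user32

class DEVMODE(ctypes.Structure):
    # Display variant of the DEVMODEW structure (union flattened for displays).
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
        ("dmICMMethod", ctypes.c_ulong),
        ("dmICMIntent", ctypes.c_ulong),
        ("dmMediaType", ctypes.c_ulong),
        ("dmDitherType", ctypes.c_ulong),
        ("dmReserved1", ctypes.c_ulong),
        ("dmReserved2", ctypes.c_ulong),
        ("dmPanningWidth", ctypes.c_ulong),
        ("dmPanningHeight", ctypes.c_ulong),
    ]

ENUM_CURRENT_SETTINGS = -1
DM_PELSWIDTH, DM_PELSHEIGHT = 0x00080000, 0x00100000

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)
user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
print("current:", dm.dmPelsWidth, "x", dm.dmPelsHeight)

dm.dmPelsWidth, dm.dmPelsHeight = 1920, 1080        # example target resolution
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT
# Returns 0 (DISP_CHANGE_SUCCESSFUL) on success.
print("result:", user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0))
```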