Question: Hooking up a TVPC, Integrated Graphics vs External Graphics

Imacflier

Jan 19, 2014
G'Morning, All

I am building a TVPC based on an i7-9700T and an RTX 3050 8GB. The RTX 3050 will be connected through an M.2-to-PCIe adapter. The TV has multiple remotely switched HDMI inputs.

I plan on connecting the TV via one HDMI input to the RTX 3050, and using another HDMI input to connect to the i7-9700T's integrated graphics.

Will the result be that:

when the HDMI input connected to the RTX 3050 is selected, the TV will display RTX 3050 video, and,

when the HDMI input connected to the i7-9700T's integrated graphics is selected, the TV will display integrated graphics?

This looks pretty straightforward to me. Is it correct?

Please Advise, and,

TIA

Larry
 
This looks pretty straightforward to me. Is it correct?
To an extent.

When a dedicated GPU is in the system, the iGPU will usually be disabled automatically. On laptops, you can enable the iGPU and use both alongside each other, but on desktops it depends on the MoBo BIOS features.

The RTX 3050 will be connected through an M.2-to-PCIe adapter.
What kind of ATX MoBo doesn't have a PCI-E x16 slot? You'd be using an M.2-to-PCI-E adapter to hook the RTX 3050 into the system, while losing half the PCI-E lanes the GPU uses.

The RTX 3050 is a PCI-E 4.0 x8 GPU. A common M.2 slot is PCI-E x4, but there are M.2 slots out there that are only x2 or even x1 (dual-lane or single-lane PCI-E).
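To put the lane loss in numbers, here's a rough back-of-the-envelope sketch. The per-lane figures are approximate usable rates after encoding overhead, and `link_bandwidth` is just a hypothetical helper, not anything from a real tool:

```python
# Rough PCIe bandwidth sketch (assumed ballpark figures, not exact
# measured numbers): usable one-direction throughput per lane.
GBPS_PER_LANE = {3.0: 0.985, 4.0: 1.969}  # GB/s per lane, approximate

def link_bandwidth(gen: float, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

native = link_bandwidth(4.0, 8)   # RTX 3050's native PCIe 4.0 x8 link
m2_gen4 = link_bandwidth(4.0, 4)  # typical Gen4 M.2 slot
m2_gen3 = link_bandwidth(3.0, 4)  # typical Gen3 M.2 slot (9th-gen Intel)

print(f"native 4.0 x8: {native:.1f} GB/s")
print(f"M.2 4.0 x4:    {m2_gen4:.1f} GB/s ({m2_gen4 / native:.0%} of native)")
print(f"M.2 3.0 x4:    {m2_gen3:.1f} GB/s ({m2_gen3 / native:.0%} of native)")
```

So a Gen4 x4 M.2 slot gives roughly half the GPU's native link bandwidth, and a Gen3 x4 slot roughly a quarter.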

Sounds like you're using a laptop MoBo as a base. That would explain the notion of using the RTX 3050 as an external GPU via the M.2 slot, and why your CPU is a low-TDP i7-9700T.

Still, the RTX 3050 is a 130W GPU, so where do you get the supplementary power it needs via the PCI-E power connector?
An M.2 slot can provide about 15W of power, maybe up to 30W in the best case. Nowhere near what the GPU needs. A PCI-E x16 slot can provide up to 75W. Still far too little for this GPU.
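The power-budget gap above can be sketched the same way, using the nominal limits quoted in the post (actual board designs vary):

```python
# Power-budget sketch: how far short each slot falls of the GPU's draw,
# using the nominal figures from the discussion above.
GPU_DRAW_W = 130  # RTX 3050 rated board power

SLOT_LIMITS_W = {
    "M.2 slot (typical)": 15,
    "M.2 slot (best case)": 30,
    "PCIe x16 slot": 75,
}

for slot, limit in SLOT_LIMITS_W.items():
    shortfall = GPU_DRAW_W - limit
    print(f"{slot}: {limit}W available, {shortfall}W short of {GPU_DRAW_W}W")
```

Even the best case leaves a 55W+ deficit, which is why an external power source is the only option here.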
 
Aeacus,

Thank you for the reply. I am using an HP EliteDesk 800 G5 Mini, thus no provision for a GPU other than over M.2. Power will also be external.

I had been under the impression that only when a monitor was 'seen' would a GPU, either internal or external, be active.

Larry
 
I don't think it works that way. Either the GPU will be powered up and ready to output, or it won't be. Power saving will kick in while they're idle, but that isn't the same as them being off.

My question would be: why? If your goal is an HTPC, I'm sure the iGPU is sufficient for that. If you want to switch back and forth for gaming, it would make more sense to let the RTX 3050 idle as it will and leave the iGPU disabled.

Back when I was using a chip with integrated graphics, I went ahead and installed a GPU for decoding purposes anyway and just used that. My current HTPC doesn't have integrated graphics, so I use a dedicated A380 (though that purchase was a bit of a whim; it was a GT1030 for quite some time).
 
To an extent.

When a dedicated GPU is in the system, the iGPU will usually be disabled automatically. On laptops, you can enable the iGPU and use both alongside each other, but on desktops it depends on the MoBo BIOS features.
No. I've heard this for AMD, so it might happen there, but on Intel the iGPU stays active; the dGPU will just be assigned as the main GPU when both are connected at the same time, whether during installation or at any other time.
There is a BIOS setting for which one to initialize first (and thus which will be the main one).
Will the result be that:

when the HDMI input connected to the RTX 3050 is selected, the TV will display RTX 3050 video, and,

when the HDMI input connected to the I-9700t integrated graphics is selected the TV will display integrated graphics?

This looks pretty straightforward to me. Is it correct?
No. You assign a main GPU in Windows settings (actually, you select the main display), and every app/game that can choose between GPUs will use whichever GPU is connected to that display.
With your display being the same one for both GPUs, you have to go into the Windows graphics settings and set up every single app/game individually to use whichever GPU you like (well, only the ones that don't already run on the one you want).
[Image: Windows 11 graphics settings, per-app GPU preference]
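Those per-app preferences also land in the current user's registry, so they can be scripted instead of clicked through. A minimal sketch, assuming Windows 10/11's `UserGpuPreferences` key; the `set_gpu_preference` helper and the game path are hypothetical (preference values: 0 = let Windows decide, 1 = power saving/iGPU, 2 = high performance/dGPU):

```python
import sys

# Windows stores per-app GPU choices under this HKCU key; each value name
# is the app's full .exe path, with data like "GpuPreference=2;".
KEY_PATH = r"SOFTWARE\Microsoft\DirectX\UserGpuPreferences"

def preference_value(pref: int) -> str:
    """Build the registry value data string for a GPU preference."""
    return f"GpuPreference={pref};"

def set_gpu_preference(exe_path: str, pref: int) -> None:
    """Write the per-app preference for the current user (Windows only)."""
    import winreg  # stdlib, but only available on Windows
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          preference_value(pref))

if sys.platform == "win32":
    # Hypothetical example: force one game onto the dGPU.
    set_gpu_preference(r"C:\Games\example\game.exe", 2)
```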
 
Choosing which HDMI input is connected to the display is no different from physically connecting to one of two external displays, is it not?
You are correct, there is no difference; in both cases Windows, and the apps/games, do not care in the slightest.

You can do the test yourself: play a video or a game and disconnect the HDMI for a few seconds. When you reconnect it, it will be as if nothing had happened; there is no stopping of the video/game or anything.
 
<sigh>, you are correct, as are most of the other responders. It only really makes sense to simply use the RTX 3050 as the sole video handling GPU (at least once I discovered the fan will shut off at idle!)

Thank you all, I learned quite a bit from this thread!

My apologies for 'wasting' your time.

Thanks again, and warm regards,

Larry
 