[SOLVED] Question about switching displays to 2 monitors

rage690

Distinguished
Mar 22, 2013
53
2
18,535
We have two rooms.

Room A (where my desktop PC with its HDMI monitor is set up)
Room B (where I sometimes stay to keep my autistic nephew company)

I wanted to be able to use my computer from Room B, so I drilled a hole in the wall to pass the connections through.
But I realized it's not a place where I'm comfortable using my computer; there are times when I need peace and quiet, so I still prefer to have it mainly in Room A.
Still, I'd like to be able to use it in Room B, and I have an extra VGA monitor.

I thought about HDMI switches, but then I'd need the HDMI switch along with an HDMI to VGA converter, because my extra monitor is VGA.
My video card, however, is an ASUS TUF GTX 1660 Super X3 with, I think, VGA, DisplayPort, and HDMI output connections.

I was thinking about leaving the HDMI port connected to my main HDMI monitor and then connecting my VGA monitor to the VGA connection on the video card. Is that possible?

If it is possible, will the GPU use double the resources since it's outputting to both an HDMI (1920x1080) monitor and a VGA (1366x768) monitor?
If so, will it only use a single share of resources if I switch one monitor off, or does the GPU still do double the work?

Thank you, guys.
 
Satan-IR

Splendid
Ambassador
I don't think a 1660 would have a VGA port/output. Maybe a DVI-D?

Also, you usually can't use both at the same time, or at best the displays would be mirrored.

I'd say get a DVI-D to VGA active adapter/converter and use HDMI for the HDMI-capable display/monitor.
 

rage690

Distinguished
Mar 22, 2013
53
2
18,535
I don't think a 1660 would have a VGA port/output. Maybe a DVI-D?

Oops, sorry, it was a DVI-D ... I forgot I had one of those DVI to VGA adapters that I used with it.

I'd say get a DVI-D to VGA active adapter/converter and use HDMI for the HDMI-capable display/monitor.

So does that mean I can:
leave the HDMI connection from the GPU to the HDMI monitor,
leave the DVI connection (with the DVI to VGA adapter) from the GPU to the VGA monitor,
and just switch OFF the HDMI monitor when I want to use the VGA monitor,
then switch OFF the VGA monitor when I want to use the HDMI monitor?
And the GPU will not double its resources when both physical connections are plugged in but one monitor is OFF?

Thank you
 

Satan-IR

Splendid
Ambassador
Oops, sorry, it was a DVI-D ... I forgot I had one of those DVI to VGA adapters that I used with it.

So does that mean I can:
leave the HDMI connection from the GPU to the HDMI monitor,
leave the DVI connection (with the DVI to VGA adapter) from the GPU to the VGA monitor,
and just switch OFF the HDMI monitor when I want to use the VGA monitor,
then switch OFF the VGA monitor when I want to use the HDMI monitor?
And the GPU will not double its resources when both physical connections are plugged in but one monitor is OFF?

Thank you

You're welcome.

Yes, if that adapter still works with the VGA monitor the way it used to, the whole setup would be fine to use the way you described.

When one display is off and the other is working through its port, the one that's off is just physically connected, sitting there and not using any resources.
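
If you ever want to verify that yourself, on a Linux box the DRM sysfs interface shows, for each connector, whether something is plugged in and whether that output is actually being driven. A rough sketch (the /sys/class/drm path and the connector names are Linux-specific and will differ per system):

#!/usr/bin/env python3
# Rough sketch: list each GPU connector, whether a display is plugged in
# ("status") and whether that output is actually being driven ("enabled").
# Assumes Linux with the DRM sysfs interface at /sys/class/drm.
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    status = (conn / "status").read_text().strip()    # connected / disconnected
    enabled = (conn / "enabled").read_text().strip()  # enabled / disabled
    print(f"{conn.name}: plugged in = {status}, being driven = {enabled}")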

Each monitor communicates with the graphics card (over I2C, exchanging EDID data) when it's in use; it describes its capabilities to the video source (the graphics card here), and the two work properly together.
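
For what it's worth, that EDID is just a small chunk of binary data the monitor hands over. Here's a minimal sketch of decoding a couple of fields from it, the manufacturer ID and the preferred resolution, again assuming Linux so the blob can be read from sysfs (the connector name below is only an example):

#!/usr/bin/env python3
# Minimal sketch: decode two fields from a standard 128-byte EDID base block
# so you can see what a monitor reports to the graphics card.
# The /sys path assumes Linux; "card0-HDMI-A-1" is just an example connector.
from pathlib import Path

def describe_edid(edid: bytes) -> str:
    if len(edid) < 128 or edid[:8] != bytes.fromhex("00ffffffffffff00"):
        return "no valid EDID (display off or nothing connected?)"
    # Manufacturer ID: three letters packed into two bytes, 5 bits per letter.
    mid = (edid[8] << 8) | edid[9]
    maker = "".join(chr(((mid >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Preferred timing: first detailed descriptor, bytes 54-71.
    width = ((edid[58] >> 4) << 8) | edid[56]
    height = ((edid[61] >> 4) << 8) | edid[59]
    return f"made by {maker}, preferred mode {width}x{height}"

print(describe_edid(Path("/sys/class/drm/card0-HDMI-A-1/edid").read_bytes()))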
 
Solution