Specs:
Palit GTX 1650 Super
Displays:
Samsung u24e590ds (running at 3840x2160, DP 1.2a)
Dell E2420H (running at 1920x1080, vga to hdmi)
LG L1942S (running at 1280x1024, VGA to DVI-D)
Ok, so I just got a 4K display. I hooked it up and it works fine. I then wanted to plug another display into my system, and since it only has a VGA port, I got a VGA to DVI-D adapter. Plugging a monitor in with it results in no image on any of the displays, and unplugging it doesn't bring the image back. Then, seconds later, the graphics card's fan starts spinning super loud.
However, if I boot Windows with only the DVI-connected display attached, it works fine.
So basically, plugging in a display through DVI does weird things to my GPU. Can I somehow make all 3 displays work at the same time? Could it be a driver issue? Any help appreciated.
Edit: I found out that if I plug in DVI only, Nvidia Control Panel detects it as DVI; however, when I plug in both DP and DVI, the control panel says the monitor connected through DVI is HDMI.