Crimonit

Prominent
Sep 23, 2022
Specs:
Palit GTX 1650 Super
Displays:
Samsung U24E590DS (running at 3840x2160, DP 1.2a)
Dell E2420H (running at 1920x1080, VGA to HDMI adapter)
LG L1942S (running at 1280x1024, VGA to DVI-D adapter)

Ok, so I just got a 4K display. I hooked it up and it works fine. I then wanted to plug another display into my system, and since it only has a VGA port, I got a VGA to DVI-D adapter. Plugging the monitor in with it results in no image on any of the displays, and unplugging it does not bring the image back. Then, seconds later, the graphics card's fan starts spinning extremely loudly.

However, if I boot Windows with only the DVI-connected display attached, it works fine.

So basically, plugging a display in through DVI does weird things to my GPU. Can I somehow make all three displays work at the same time? Could it be a driver issue? Any help appreciated.

Edit: I found out that if I plug in only DVI, Nvidia Control Panel detects it as DVI; however, when I plug in DP and DVI together, Control Panel reports the monitor connected through DVI as HDMI.
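In case it helps anyone reproduce this, here is a rough Python sketch for double-checking what connection type the driver is reporting to Windows for each monitor. It assumes a Windows machine with the third-party "wmi" package installed (pip install wmi); the numeric values come from the D3DKMDT_VIDEO_OUTPUT_TECHNOLOGY enum, so this is just a sanity check, not an official Nvidia tool:

import wmi  # third-party "wmi" package, Windows only

# Subset of D3DKMDT_VIDEO_OUTPUT_TECHNOLOGY values that matter here
VOT_NAMES = {0: "VGA (HD15)", 4: "DVI", 5: "HDMI",
             10: "DisplayPort (external)", 11: "DisplayPort (embedded)"}

conn = wmi.WMI(namespace="root\\wmi")
for params in conn.WmiMonitorConnectionParams():
    tech = params.VideoOutputTechnology
    print(params.InstanceName, "->", VOT_NAMES.get(tech, f"other ({tech})"))

If the DVI-attached monitor shows up as HDMI here too, the mislabeling is happening at the driver/OS level rather than just in the Control Panel UI.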
 

Crimonit

Prominent
Sep 23, 2022
Which DVI-D to VGA adapter are you using?
It has to be an active one, because the GTX 1650 doesn't support analogue signals.

Try lowering the resolutions and have a look at whether it can display on all monitors after that.
I can't seem to find any info on whether my adapter is active or not; it comes with a square box. As for lowering resolutions, I tried setting them to minimum before plugging in DVI, with no result. All screens go black until I restart the PC without DVI connected.