Second monitor not detected

Brian_182

Sep 6, 2016
Hello!

So I recently bought a new video card and monitor: an Nvidia GTX 1060 and a Philips 240V5QDAB monitor. Before this I had a GTX 760 and an old monitor that only has a VGA port on it. I connected that monitor with a VGA to DVI-I adapter to my GPU and had no problems with it.
When I installed my new GPU and monitor (the new monitor connected to my GPU over HDMI), I noticed the GTX 1060 doesn't have a DVI-I port, but two DVI-D ports. So I went ahead and bought a VGA to DVI-D adapter and plugged my old monitor into my GPU.

Windows doesn't detect it: it's not in Device Manager, the Nvidia Control Panel isn't picking it up either, and I have searched the internet for hours but cannot seem to find a solution.

I'm running Windows 10 Pro.

 
Solution
Hello... DVI-D is a digital-only output... https://en.wikipedia.org/wiki/Digital_Visual_Interface
A RAMDAC chip on a video card is what converts digital to analog... converting just adds an extra step and degrades the image... your information and data are digital to begin with, so it is best to use a monitor that has a digital input too.
1) Your monitor has HDMI? Get a "passive" DVI-D to HDMI adapter.
2) Your video card has HDMI? Use an HDMI cable to your monitor.
3) Or use a DVI-D cable to your monitor.
4) Use your monitor's on-screen setup menu to select the input/cable used.

It looks like your monitor has both HDMI and DVI-D inputs: http://www.philips.com.sg/c-p/240V5QDAB_69/lcd-monitor-with-smartcontrol-lite/specifications
Dump your VGA cables and analog inputs/outputs for the best performance.

 
Solution
Hi

Thank you for answering so fast! But my Philips monitor isn't the problem; it's the second (older) monitor that really isn't working. I'm not that knowledgeable about cables, so sorry for the confusion. So it wouldn't be possible for me to use a second monitor with a VGA to DVI-D adapter?
 
Hello... Modern high-end GPUs no longer include a RAMDAC chip (the digital-to-analog converter)... Your options:
1) Trade your card in for one with a RAMDAC chip / a DVI-I output on it.
2) Use your CPU's/motherboard's integrated GPU and connect the motherboard's VGA connector to the VGA monitor (enable it in your BIOS, then save and exit).
3) Install a second GPU that has a VGA output.
4) Buy another non-VGA monitor.

You would need an "active" digital-to-VGA converter in this case, not a passive VGA to DVI-D adapter... a passive adapter only reroutes pins and cannot generate the analog signal a VGA monitor needs.