Can't use dual monitors with GTX 1060

Sep 4, 2018
I just upgraded my 760 to a 1060 and I'm having the hardest time trying to get both monitors to work at the same time.

On the 760 I had both monitors connected using DVI. However, since the 1060 only has one DVI port, I have to switch my main display to HDMI (it's technically a Samsung TV; the second screen is just an Acer computer monitor).

On the 1060...
- When I plug in just the Samsung TV through HDMI, the single-screen display works.
- When I plug in just the Acer monitor through DVI, the single-screen display works.
- When I try to plug them both in at once, only one of them shows up.

Both the Windows screen resolution settings and the NVIDIA Control Panel detect only one monitor at a time (but when just a single one is plugged in, they work just fine). I even tried "Rigorous Display Detection" in the Multi-Display section of the NVIDIA Control Panel, and it won't detect the other monitor.
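(In case it helps anyone narrow this down, below is a rough Python sketch, just ctypes and the standard Win32 EnumDisplayDevices call, that lists every display adapter output and any monitor Windows itself reports on it. It's only a diagnostic idea, nothing NVIDIA-specific; the structure layout and flag value come straight from the Win32 docs.)

```python
# Quick diagnostic (assumes Python 3 on Windows): list every display adapter
# output and any monitor Windows reports on it, using only ctypes and the
# Win32 EnumDisplayDevices call. StateFlags bit 0x1 = "attached to desktop".
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

def new_dd():
    dd = DISPLAY_DEVICE()
    dd.cb = ctypes.sizeof(dd)
    return dd

i, adapter = 0, new_dd()
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(adapter), 0):
    active = bool(adapter.StateFlags & 0x1)
    print(f"{adapter.DeviceName}  {adapter.DeviceString}  active={active}")
    # Passing the adapter's name enumerates the monitor(s) connected to it.
    j, mon = 0, new_dd()
    while user32.EnumDisplayDevicesW(adapter.DeviceName, j, ctypes.byref(mon), 0):
        print(f"    monitor: {mon.DeviceString}")
        j, mon = j + 1, new_dd()
    i, adapter = i + 1, new_dd()
```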

Any ideas what could be causing this issue? Thanks!

I'm on 64-bit Windows 7. Just installed the latest NVIDIA driver (399.07).
(Also, the text on my HDMI monitor looks very pixelated and poor quality, much worse than it did through DVI on my 760.)

EDIT: Dual monitors seem to work when I use two HDMI ports. Any combination of HDMI and DVI doesn't work, though. Problem is, my second screen (the Acer monitor) only has DVI. The 1060 also has two DisplayPort outputs. If I were to buy a DVI-to-DisplayPort adapter, would that alleviate the issue? In that scenario, I'd connect the Samsung TV as Display 1 using DVI, and the Acer monitor as Display 2 using DVI-to-DisplayPort. Does that sound like it would work? Or is there some kind of analog/digital issue that I'm not fully grasping here?
 
Solution
There is no such thing as a high-quality digital cable. You may see marketing claims, but they're nonsense; the picture is unaffected by the quality of the cable. That's the main advantage of digital: the signal is just 1s and 0s. You never said it was ghosting before. That could mean the monitor changes its settings depending on the input.

Thanks for the suggestion. Where would I go to find this auto detect option? Is it in the NVIDIA Control Panel? Or is it a Windows setting?
 
The TV and monitor show up when each is connected on its own, so they're already detecting the right input; that's not the issue. Try DDU (Display Driver Uninstaller) and reinstall the driver. Also try an older driver if that doesn't work.

Make sure HDMI is set to Full RGB in the NVIDIA Control Panel.
 


Sadly, I already tried DDU and a fresh install of the driver, but no luck. It also appears that HDMI is already set to RGB.

I did a bit of additional troubleshooting with an old Philips TV of mine. I discovered that if I plug it in as a second monitor through HDMI, with my Samsung set as the main display (also using HDMI), I can get two monitors to appear. However, any combination of the Acer, Samsung, and Philips that involves one HDMI and one DVI doesn't work. So it seems to be some kind of compatibility issue.
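(Side thought, in case it's useful to anyone: the rough Python sketch below reads the standard Plug and Play registry key to show which monitors Windows has recorded. It's only a diagnostic idea, nothing NVIDIA-specific, but if no instance for the Acer ever appears while the HDMI screen is connected, Windows isn't even seeing it over DVI.)

```python
# Rough check (standard library only): dump the monitor hardware IDs Windows
# Plug and Play has recorded. Each subkey under Enum\DISPLAY is a monitor
# model (the Acer, Samsung, and Philips should each show up by EDID ID),
# and each instance under it is a connection Windows has actually seen.
import winreg

path = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as display:
    i = 0
    while True:
        try:
            model = winreg.EnumKey(display, i)
        except OSError:
            break
        print(model)
        with winreg.OpenKey(display, model) as model_key:
            j = 0
            while True:
                try:
                    print("   ", winreg.EnumKey(model_key, j))
                except OSError:
                    break
                j += 1
        i += 1
```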

Based on a few quick searches, I'm beginning to suspect some sort of analog-vs-digital issue. My DVI cable and the port on the 1060 are both DVI-D, though (not DVI-I). Based on my (very limited) knowledge of DVI, I thought that meant the signal was purely digital and thus shouldn't be causing a conflict. Am I mistaken?

In addition to the 2 HDMI ports and 1 DVI port, the 1060 also has 2 DisplayPort outputs. If I were to purchase a DVI-to-DisplayPort adapter, would that likely solve my problem? In that scenario, would I be able to run one monitor through the adapter on DisplayPort and the other on the DVI port? I'd like to avoid HDMI if possible, as it looks pretty awful compared to the crisp picture I'm getting with DVI.
 
There is no plain "RGB" setting; it's either Limited or Full. HDMI defaults to Limited, which causes color issues. It's all digital now; there's no analog involved. HDMI should look the same as DVI unless the settings aren't correct. That's the thing about digital: the signals are the same.
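(For reference, a tiny sketch of what Limited actually does to the pixel values; these are the standard video levels, nothing NVIDIA-specific.)

```python
# Tiny illustration of "Limited" vs "Full": full-range RGB spans 0-255,
# limited ("video") range spans 16-235. If the GPU outputs limited range to
# a display expecting full range, black gets lifted to 16 and white capped
# at 235, which gives exactly the washed-out, low-contrast look.
def full_to_limited(value: int) -> int:
    return round(16 + value * 219 / 255)

for v in (0, 64, 128, 192, 255):
    print(f"{v:3d} -> {full_to_limited(v):3d}")
# 0 -> 16, 64 -> 71, 128 -> 126, 192 -> 181, 255 -> 235
```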

Some people have been reporting issues using DVI alongside the other ports. It seems to be driver related, but it also only happens on certain models. I have an MSI 1060 and it doesn't have the issue. Using a DisplayPort adapter could work if you don't mind spending the money. You'd want DP to DVI, not DVI to DP; adapters only work one way.
 


Ahh, I see what you mean about the RGB. Yeah, I had it set to Limited. Even after switching it to Full, however, the ghosting artifacts are still present. It just doesn't look as crisp as my DVI cables for some reason. I'm sure different HDMI cables produce different results, but I'm fairly certain the two I tried were originally packaged with my Nintendo Switch and PS4 Pro, so I'd assume they're fairly good quality.

The 1060 I bought is an Asus, the 3 GB model. Hopefully that one doesn't have issues with DVI and DisplayPort. At this point, it seems like the DisplayPort adapter is my only option outside of buying a new monitor or giving up and going back to my 760. Kind of a bummer that I'm getting better results from a card from 2013 than from one released just two years ago.

I will most likely shell out the cash for an adapter and report my results. Thanks for all of your help k1114!
 