No Signal Detected

Jul 20, 2019
Hi all,
I have two monitors, one is new.
Here is what happened chronologically.
- Got a new monitor, plugged it in, and it worked totally fine alongside my old monitor; dual screen was amazing. I assigned the new monitor as primary. It was connected via an HDMI cable.
- Got a DisplayPort cable and swapped it in for the HDMI cable. Everything continued to work as before.
- Weeks passed. Then I plugged my old monitor into a different, somewhat older computer. The monitor worked fine there. But after plugging it back into the new computer, the old monitor is no longer detected.

I have tried all the usual fixes for this problem, and here is what I found:
- When I disable the Nvidia driver, the computer boots up and displays on the old monitor, but it cannot switch to its native resolution.
- The new monitor reaches its native resolution even with the generic Windows display driver. But if I plug in the new monitor, the old one drops signal and can no longer be detected. The same thing happens if I re-enable the Nvidia driver with just the old monitor attached.
- Safe mode gives the same results.
- Rolling back the Nvidia driver and trying again does nothing; the old monitor still won't be detected on the new computer.
- The same goes for reinstalling the Generic Non-PnP Monitor driver through Windows.
- Additionally, I looked at the old monitor's device events for the times it does get a signal (with the Nvidia driver disabled, as above). I see the following event:
"Device DISPLAY\Default_Monitor\1&8713bca&0&UID0 was not migrated due to partial or ambiguous match."

For now, I'm just getting by with the one working monitor.
I am not looking for a silver bullet; the monitor is over 10 years old, so it's okay if it's done.
Mainly, I just want to confirm that I'm not fooling myself: could the old monitor somehow have gotten adjusted to the older computer, and is that why it won't work with the new one anymore?

Thank you for all the help.
 
