Monitor out of range

Jul 12, 2018
I have a problem with my monitor. Yesterday I wanted to watch movies on my 4K TV, so I unplugged the VGA cable from the monitor and plugged it into the TV. When I finished watching, I plugged the VGA cable back into the monitor, but it only shows "Out of range". I tried restarting the PC a couple of times and it still wouldn't work. I connected the PC to the TV, changed the resolution to the native resolution of my monitor, set the refresh rate to 60 Hz, and it still wouldn't work. Then, while connected to the TV, I put the PC into safe boot, and when I connected it back to the monitor it worked normally. But as soon as I turned off safe boot, or changed the resolution, it would say "Out of range" again.
I also noticed that the NVIDIA Control Panel shows the current display as a UHD LCD TV. How can I change it back to normal?
My specs are:
PNY GeForce GTX 960
i5-4690k
8gb ddr3
 
Jul 12, 2018
I fixed it. I turned the PC on and shut it down three times to get into the troubleshooting menu, opened Startup Settings, and enabled low-resolution mode.
Anyway thanks for your help guys.
 
Shut the system down using Windows shut down, and then unplug it for 30 seconds.

Unplug the monitor power cord and the monitor video cable for a few minutes. Then reconnect the monitor. Start up the PC. Then go to Windows Display to identify the monitor and set the resolution.
 

LinuxDevice

Is this actually an old analog 15-pin VGA connection? If so, you must set the monitor's details manually. Unlike HDMI or DisplayPort, VGA is not capable of telling the video card what its valid settings are.
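For anyone curious why digital connections don't have this problem: HDMI and DisplayPort monitors hand the video card an EDID block over the DDC lines, and the card reads the supported modes out of it. A minimal sketch (this is illustrative, not a tool from this thread) of decoding one 18-byte "detailed timing descriptor" per the VESA EDID spec shows exactly the information a bare VGA link can't deliver:

```python
def parse_dtd(dtd: bytes):
    """Decode pixel clock, resolution, and refresh rate from an 18-byte
    EDID detailed timing descriptor (VESA EDID 1.3 byte layout)."""
    # Bytes 0-1: pixel clock in 10 kHz units, little-endian
    pixel_clock = int.from_bytes(dtd[0:2], "little") * 10_000
    # Horizontal active/blanking: low 8 bits, plus upper nibbles in byte 4
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    h_blank = dtd[3] | ((dtd[4] & 0x0F) << 8)
    # Vertical active/blanking: low 8 bits, plus upper nibbles in byte 7
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    v_blank = dtd[6] | ((dtd[7] & 0x0F) << 8)
    # Refresh rate = pixel clock / total pixels per frame
    refresh = pixel_clock / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh, 1)

# Example descriptor for 1920x1080 @ 60 Hz (148.5 MHz pixel clock,
# CEA-861 timing: 280 px horizontal blanking, 45 lines vertical blanking);
# trailing sync/border bytes zeroed for brevity.
dtd = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40] + [0x00] * 10)
print(parse_dtd(dtd))  # (1920, 1080, 60.0)
```

Without this data, the card is guessing, which is why a VGA monitor ends up stuck on whatever mode was last set for the TV.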
 

At the top of the Windows Display screen is the Identify section. Use the Detect button to detect any connected monitor. Then use Identify to assign a label to the monitor within Windows.

Also, I would try a different adapter for VGA. HDMI-to-VGA or DisplayPort-to-VGA adapters are a better choice; not all DVI ports can drive a VGA adapter.
 


Go to Windows Settings to open the Settings home, then click on Display (under System).

At the top of Windows Display is "Select and rearrange displays"; "Identify" and "Detect" are part of that feature.
 

LinuxDevice



It isn't possible for configuration to work automatically with VGA. You've literally cut the wire that gives the video card the ability to ask the monitor what it can do. This kind of display requires manually forcing a mode, and modern Windows might not like that.

Something which might help: HDMI is hot-plug. If you plug in a valid HDMI monitor, then when you unplug it and connect a VGA monitor, it should remain in the same mode the HDMI monitor used (at least until another HDMI hot plug event or reboot).
 
 
