[SOLVED] 144 Hz monitor will only display 120 Hz

Apr 17, 2020
Yesterday, I bought a 2560x1440 144 Hz monitor (Viotek GN32DB). I have it connected to my GTX 760 over a DisplayPort cable to ensure 144 Hz, but when I go into the display settings and try to set it to 144 Hz, that option doesn't appear at the 2560x1440 resolution. The highest available is 120 Hz. However, if I switch the resolution to 1920x1080, I am able to select 144 Hz. Can anyone help me figure out why it's doing this? All my graphics drivers are up to date. I should add that the person I bought it from had it working at the higher resolution with 144 Hz.
 
Solution
It could be a bad cable. You could also try creating a custom resolution in the Nvidia Control Panel. Sometimes the display scaling percentage in Windows settings can cause odd behavior when it isn't set to 100%, but that's a shot in the dark. Troubleshooting is often best done by swapping parts one at a time to confirm that everything downstream of each part works. If you don't have much alternative hardware on hand, booting from a USB thumb drive with Linux could at least narrow the problem down to either before or after the OS.
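As a rough sanity check on the bad-cable theory, here is a back-of-the-envelope bandwidth sketch (not exact EDID timings; it assumes approximate CVT-RB blanking, 24 bpp, and the effective post-8b/10b DisplayPort data rates). It illustrates why a cable or link negotiating at older DP 1.1 speeds could allow 1080p at 144 Hz while refusing 1440p at 144 Hz:

```python
# Hypothetical bandwidth check: does a given mode fit a DisplayPort link?
# Assumptions (not from the thread): CVT-RB-style blanking (160 px
# horizontal, ~3% vertical), 24 bits per pixel, and effective data
# rates of 8.64 Gbps (DP 1.1 / HBR) and 17.28 Gbps (DP 1.2 / HBR2).

DP_EFFECTIVE_GBPS = {"DP 1.1 (HBR)": 8.64, "DP 1.2 (HBR2)": 17.28}

def required_gbps(width, height, refresh_hz, bpp=24):
    htotal = width + 160                     # approx. CVT-RB horizontal blank
    vtotal = int(height * 1.03)              # approx. ~3% vertical blank
    pixel_clock = htotal * vtotal * refresh_hz
    return pixel_clock * bpp / 1e9           # Gbit/s of video data

for mode in [(2560, 1440, 144), (2560, 1440, 120), (1920, 1080, 144)]:
    need = required_gbps(*mode)
    fits = [name for name, cap in DP_EFFECTIVE_GBPS.items() if need <= cap]
    print(f"{mode[0]}x{mode[1]}@{mode[2]} Hz: ~{need:.1f} Gbps -> fits {fits}")
```

With these rough numbers, 1920x1080 at 144 Hz squeezes into a DP 1.1 link, while 2560x1440 at 144 Hz needs DP 1.2 speeds, which is consistent with swapping the cable being a good first test. The GTX 760's DisplayPort output supports DP 1.2, so the GPU itself should not be the hard limit here.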