Question: Can't display 1920x1080 resolution on a 1366x768 monitor anymore

Bogdan7585

Hello. I have a monitor with a native resolution of 1366x768. Earlier today I tried to set the resolution to 1920x1080 in the Nvidia Control Panel, but I got a black screen and a message on the monitor saying it won't display resolutions above 1366x768. The strange thing is that I was able to set resolutions higher than native, including 1920x1080, before. Please help.
 
Why would you do that? You gain no improvement in image quality and you cut your performance by close to half. On AMD cards the feature is called Virtual Super Resolution.
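To put rough numbers on "close to half" (a back-of-the-envelope Python sketch; the real cost depends on the game and GPU, so this is only the pixel-count side of it):

native = 1366 * 768      # 1,049,088 pixels on the monitor's native grid
rendered = 1920 * 1080   # 2,073,600 pixels the GPU has to render each frame
print(rendered / native) # ~1.98, i.e. nearly double the pixel workload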
 

Bogdan7585

I downloaded a PC testing program, and it requires a higher resolution to run one of its 3D GPU tests. There is no other reason. I never use a higher resolution otherwise because it looks horrible and the performance is worse. But anyway, I was able to raise it earlier, and now for some reason it doesn't work anymore.
 
Virtual super resolution isn't guaranteed to work or be stable.
 

joeblowsmynose


I guess you've never tried it before ...

The game Elite Dangerous has some wickedly horrible aliasing that only the highest level of AA, applied in supersampling mode, can even remotely tame ... and even at that level it's still pretty bad, and the performance hit of all that AA is pretty high.

On a whim, I tried scaling up to 1440p from 1080p. All the difference in the world. Everything is mind-blowingly sharp, and I can get away with no AA or just FXAA at 1440p -- because downscaling from a higher resolution is exactly what AA does. Back in the olden days, that's all AA was: the frame was rendered at a higher resolution and then downscaled for your monitor. Since then they've found ways to speed it up with fancy algorithms, edge detection, etc., but always at some loss of overall effectiveness and quality compared to pure downscaling from a higher rendered resolution.
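As a minimal sketch of that render-high-then-downscale idea (a plain box-filter average in Python/NumPy; actual drivers and games use more sophisticated filters, so treat this as an illustration of the principle only):

import numpy as np

def supersample_downscale(frame, factor=2):
    # Average each factor x factor block of the high-res frame down to one
    # display pixel -- the basic idea behind old-school supersampling AA.
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# Hypothetical example: a frame rendered at 3840x2160 averaged down to 1920x1080.
hi_res = np.random.rand(2160, 3840, 3)            # height x width x RGB
lo_res = supersample_downscale(hi_res, factor=2)
print(lo_res.shape)                               # (1080, 1920, 3)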

So by using no AA, or very low AA, at 1440p instead of full AA at 1080p, I only lose about 10% of my FPS (which I have headroom for anyway), textures that would otherwise get smoothed out by game-engine and driver optimization come through clearly, and everything is wicked sharp. Contrast is better too. Granted, you need a bit of FPS headroom to absorb the performance hit, and if image quality isn't something you care about, then this probably isn't for you.

I've been doing this for years with Elite and have never had it not work or not be stable.

I can think of a few other use cases for it as well outside of gaming.


As far as the OP's request goes, I don't even know where that setting is on Nvidia cards, so I can't be of much help.
 

Bogdan7585

I reset my monitor settings and it's all good now. I must've changed something accidentally.
 