Hello.
I connected my 4K TV to my laptop yesterday, but there wasn't enough VRAM to run RDR2 at 1440p, so I had two options to tweak things and get 30 fps:
1- Run the desktop at native 1080p, upscale to 1440p via AMD VSR, then set the in-game resolution scale to 0.7 (7/10).
2- Use native 1440p for both the game and Windows, then downscale the picture to 0.7 (7/10).
I used sharpening with both solutions.
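If my math is right, both options should make the game render the same number of pixels internally, which would explain why they felt identical. Here is a quick back-of-the-envelope sketch (assuming the 7/10 scale multiplies each axis of whatever output resolution the game is given, which is just my reading of the setting):

```python
# Back-of-the-envelope sketch: what the game actually renders in each
# setup. Assumption (mine): RDR2's 7/10 resolution scale multiplies
# each axis of the output resolution the game sees.

def internal_res(out_w, out_h, scale):
    """Resolution the game renders at before any up/downscaling."""
    return round(out_w * scale), round(out_h * scale)

# Option 1: 1080p desktop, VSR exposes a virtual 1440p target to the
# game, in-game scale 7/10.
print(internal_res(2560, 1440, 7 / 10))  # (1792, 1008)

# Option 2: native 1440p desktop and game, same 7/10 scale.
print(internal_res(2560, 1440, 7 / 10))  # (1792, 1008)

# Same internal pixel count either way, which would explain why the
# performance matched; only the scaling chain to the TV differs.
```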
Granted, I didn't have my glasses on, but performance and quality seemed the same to me. Still, I'd really like to know how downscaling and upscaling actually work.
Is it always better to start from native resolution and go from there? Or would doing what solution 1 does, rendering at 1080p, upscaling to 1440p, and then reducing the resolution scale, actually be better? (Kind of like a fake DLSS 2.0.)
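For context on what I mean by upscaling/downscaling: my mental model is a simple spatial filter like the bilinear toy below. This is only my guess at the basics, not what VSR, the TV's scaler, or DLSS actually do:

```python
# Toy bilinear scaler: sample the source image at fractional
# coordinates and blend the four nearest pixels. This is only my
# mental model of a simple (non-AI) scaler, not any real implementation.

def bilinear_resize(src, src_w, src_h, dst_w, dst_h):
    """src is a flat row-major list of grayscale pixel values."""
    dst = []
    for y in range(dst_h):
        for x in range(dst_w):
            # Map each destination pixel back into source coordinates.
            fx = x * (src_w - 1) / max(dst_w - 1, 1)
            fy = y * (src_h - 1) / max(dst_h - 1, 1)
            x0, y0 = int(fx), int(fy)
            x1, y1 = min(x0 + 1, src_w - 1), min(y0 + 1, src_h - 1)
            tx, ty = fx - x0, fy - y0
            # Blend the four surrounding source pixels.
            top = src[y0 * src_w + x0] * (1 - tx) + src[y0 * src_w + x1] * tx
            bot = src[y1 * src_w + x0] * (1 - tx) + src[y1 * src_w + x1] * tx
            dst.append(top * (1 - ty) + bot * ty)
    return dst

# Upscaling (1080p -> 1440p) invents the extra pixels by blending;
# downscaling (1440p -> 0.7x) averages detail away. Either way the
# filter, plus any sharpening pass, decides how "native" it looks.
```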
Thanks and Regards!