Which would provide higher quality, render scale or resolution, if either?

evan1715

Hey!

So, in a game like Overwatch, there's a setting for resolutions higher than my monitor's native one, as well as a render scale setting.
Currently, it's set to 1920x1200 (16:10) with the render scale at 150%.
However, there's a resolution option for 2715x1697.

So, my question is: Would it be better (for quality) to run it at 1920x1200 @ 150% or to run it at 2715x1697 @ 100%?
I get basically the same performance/FPS at both settings.
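For reference, here's the rough math on how many pixels each option actually renders (this assumes Overwatch's render scale percentage applies to each axis, so 150% of 1920x1200 is an internal render of 2880x1800):

```python
# Quick pixel-count comparison of the two options (plain Python).
# Assumption: the render scale % applies per axis, so 150% of 1920x1200
# means an internal render of 2880x1800.
options = {
    "1920x1200 @ 150%": (int(1920 * 1.5), int(1200 * 1.5)),  # 2880x1800
    "2715x1697 @ 100%": (2715, 1697),
}

for label, (w, h) in options.items():
    print(f"{label}: {w}x{h} = {w * h:,} pixels per frame")
# Both land in the same ~4.6-5.2 megapixel ballpark, which matches the
# "basically the same FPS" observation.
```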
 
Solution
It depends on your monitor's scaling algorithms. If you feed it a 2715x1697 signal, how does it scale that image down to its native 1920x1200 resolution? Some monitors have pretty bad scalers and the result will look either pixelated or really blurry. Some have very good scalers (bicubic or Lanczos) and the result looks as good as if your computer had done the scaling.
https://en.wikipedia.org/wiki/Comparison_gallery_of_image_scaling_algorithms

If your monitor has a bad scaler, setting render scale to 150% will probably look better. If your monitor has a good scaler, 150% render scale will probably look nearly identical to feeding the monitor 2715x1697 (since the game is probably also using bicubic or Lanczos scaling). I'd run it with 150% render scale just so the monitor doesn't have to do a scan-rate change flipping between a 1920x1200 desktop and a 2715x1697 game. But OTOH you may want to run it at the higher resolution to save your GPU a tiny bit of extra work (scaling the 150% render back down to 1920x1200).
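If you want to see how much the filter matters, here's a rough sketch (assuming Python 3 with Pillow 9.1+ installed; the synthetic test frame and output file names are just for illustration) that downscales the same oversized image with a cheap filter versus bicubic and Lanczos:

```python
# A rough sketch, assuming Pillow 9.1+ (for the Image.Resampling enum).
from PIL import Image, ImageDraw

# Build a 2715x1697 "frame" full of thin diagonal lines -- the kind of fine
# detail that makes scaler quality differences obvious.
src = Image.new("RGB", (2715, 1697), "black")
draw = ImageDraw.Draw(src)
for x in range(0, 2715, 4):
    draw.line([(x, 0), (x + 200, 1697)], fill="white", width=1)

native = (1920, 1200)  # the monitor's native resolution from the question

# Nearest-neighbour stands in for a "bad" scaler; bicubic and Lanczos are
# the higher-quality filters mentioned above.
filters = {
    "nearest": Image.Resampling.NEAREST,
    "bicubic": Image.Resampling.BICUBIC,
    "lanczos": Image.Resampling.LANCZOS,
}
for name, flt in filters.items():
    src.resize(native, resample=flt).save(f"scaled_{name}.png")
```

Open the three output images side by side and you'll see roughly the difference between a cheap monitor scaler and a good one.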

And the previous post is correct that render scale is pretty much the same thing as supersampling anti-aliasing. The difference is that AA algorithms are designed to operate on integer multiples of the original resolution (so at 1920x1200, 2x supersampling renders at 1920x2400, 4x at 3840x2400, etc.) since computers are usually a lot faster at integer math. But image scaling is a common enough operation that most GPUs have hardware scalers nowadays, and those algorithms are generic enough to work with non-integer ratios.
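To make that concrete, here's a tiny sketch (assuming Python 3 with NumPy; the random array just stands in for a supersampled frame) of why integer ratios are the easy case: a 4x supersampled frame comes back to native resolution by simply averaging each 2x2 block of samples, while a non-integer ratio like 150% needs a weighted filter such as bicubic or Lanczos:

```python
# A tiny sketch, assuming NumPy. The random array stands in for a
# 4x-supersampled luminance buffer (2x per axis of 1920x1200).
import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((2400, 3840))  # 2 * 1200 rows, 2 * 1920 columns

# Integer ratio: group the samples into 2x2 blocks and average each block.
native = frame.reshape(1200, 2, 1920, 2).mean(axis=(1, 3))
print(native.shape)  # (1200, 1920) -> back to the native resolution

# A non-integer ratio (e.g. 2880x1800 -> 1920x1200, i.e. 1.5x) can't be
# split into whole 2x2 blocks like this, so it needs a weighted filter
# such as bicubic or Lanczos instead.
```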
 