1080p on 4K screen

B0rTz1

Reputable
Jan 18, 2015
Hi. I have an ASUS PB287Q 4K display. After a few months of playing games in 4K, I wondered how 1080p would look, so I tried The Witcher 3 at 1080p. What I noticed is that the game looks pretty bad even on ultra with anti-aliasing on. The colors seem a little washed out too. I tried 1440p as well; it's a little better, but still nowhere near the same quality as 4K. I get the same results with other games: they all look a little blurry and full of aliasing.

The games seem to look worse at 1080p on my 4K screen than I remember them at 1080p on a 1080p screen.

My question is: does it look so bad because I got too used to 4K, or is there some loss in quality when gaming at 1080p on a 4K screen that I didn't know about?


 

ebosss03

Reputable
May 1, 2018
That's normal: 4K has four times the pixels of 1080p, which means that if you display 1080p on a 4K monitor, each 1080p pixel is shown as a 2x2 square of four pixels on the 4K panel, because you can't just "disable" those pixels.

Edit: if you run your game windowed at 1920x1080 (a quarter of the screen area), then each pixel will be rendered 1:1. But that window is super small.
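The "four times" claim above is just pixel arithmetic, easy to verify:

```python
# Plain arithmetic sanity check (not tied to any particular panel):
# a 4K UHD panel has exactly four times the pixels of a 1080p frame,
# so each 1080p pixel can map cleanly to a 2x2 block of panel pixels.
panel_pixels = 3840 * 2160   # 4K UHD
frame_pixels = 1920 * 1080   # 1080p
print(panel_pixels // frame_pixels)  # -> 4
```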
 
It depends on your monitor.

If your monitor simply scales lower resolutions using a nearest-neighbor algorithm, then 1080p will map 1 image pixel to 4 screen pixels, there will be no aliasing due to scaling, and it will look the same as a 1080p monitor (except each "pixel" will have a horizontal and vertical line through it). 1440p, however, will look like a mess with all sorts of aliasing artifacts, since 1 image pixel does not map to an integer number of screen pixels (2160/1440 = 1.5).
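A small sketch of the scale-factor math and of integer nearest-neighbor upscaling (the sizes are the standard ones; the code is illustrative, not any monitor's actual firmware):

```python
import numpy as np

def scale_factor(src_w, src_h, dst_w=3840, dst_h=2160):
    """Horizontal and vertical scale factors from a source frame to a 4K panel."""
    return dst_w / src_w, dst_h / src_h

print(scale_factor(1920, 1080))  # (2.0, 2.0) -> exact 1:4 pixel mapping
print(scale_factor(2560, 1440))  # (1.5, 1.5) -> non-integer, scaling artifacts

# Integer 2x nearest-neighbor upscale: each source pixel simply becomes
# a 2x2 block of identical screen pixels, so no new values are invented.
src = np.arange(6).reshape(2, 3)                      # tiny stand-in frame
upscaled = np.repeat(np.repeat(src, 2, axis=0), 2, axis=1)
print(upscaled.shape)  # (4, 6)
```

With a 1.5x factor there is no such clean block mapping, which is why 1440p-on-4K looks worse than 1080p-on-4K under nearest-neighbor scaling.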

If your monitor does any type of smoothing interpolation (bilinear and bicubic are the most common), then both 1080p and 1440p will look blurry. Anti-aliasing in the game will exacerbate this, so the image quality might actually be better if you turn off AA in the game. If your monitor does this type of scaling, you can actually use it to improve performance with a low-end GPU: turn off AA in the game, then set the game resolution lower than the screen resolution (e.g. 1366x768 on a 1080p monitor). The monitor's scaling then acts as a pseudo-anti-aliasing algorithm with zero GPU cost, and the lower resolution is easier for the GPU to render.
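To see why smoothing interpolation acts like free anti-aliasing, here's a toy 1-D sketch (again illustrative, not real scaler firmware): bilinear upscaling of a hard black/white edge produces intermediate gray values, which is exactly what softens jagged edges.

```python
import numpy as np

def bilinear_upscale_1d(row, factor):
    """Linearly interpolate a 1-D row of pixel intensities by `factor`."""
    n = len(row)
    xs = np.linspace(0, n - 1, n * factor)   # new sample positions
    return np.interp(xs, np.arange(n), row)

edge = np.array([0.0, 0.0, 1.0, 1.0])        # hard black-to-white edge
print(bilinear_upscale_1d(edge, 2))          # intermediate grays appear
```

The in-between values (neither 0 nor 1) are the blur the post describes: bad for a crisp 1:4 mapping, but helpful as cheap pseudo-AA on a low-res frame.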

Some monitors have an option to change the image scaling algorithm they use. Most however don't, and you're stuck with whatever the manufacturer made default.

If you're using an HDMI cable, note that some monitors assume an HDMI signal is video and will overscan it (enlarge it slightly so the edges of the image fall outside the screen). This destroys the perfect 1:4 upscaling from 1080p to 4K. Scaling due to overscan is usually invisible on real-life video, but it can turn the razor-sharp lines of computer text and rendered images into a blocky mess. You may have to search the monitor settings for an option to disable overscan.

https://en.wikipedia.org/wiki/Comparison_gallery_of_image_scaling_algorithms
 
Solution
May 5, 2020
This is infuriating. Scaling to an even multiple should NOT introduce aliasing. I can't believe that expensive gaming and design monitors don't have some intelligence for this. If you're on a 4K-native monitor and you set the resolution to 1080p, no interpolation should occur.

Ugh.