Does anybody have a good solution for scaling 1080p onto a 4K monitor?

sm620

Honorable
Oct 18, 2012
I recently bought a 4K monitor thinking that I could always upscale 1080p to 4K without interpolation, but setting Windows to 200% scaling looks blurrier than 1080p and still doesn't look right. I had better success in the GPU control panel, but it still looks blurrier than the 1080p monitor I have next to it. In theory it could use a square of 4 pixels to represent each pixel of the 1080p image.

Has anybody used any third-party software to solve this, or maybe a solution through Windows that I haven't tried? I've done some research but couldn't find any more solutions. Part of the problem could be that my 4K monitor is 27 inches vs. the 23-inch 1080p monitor, but it seems much blurrier in 1080p even from the Nvidia control panel. The edges of windows and text look softer than on the native 1080p monitor. I also tried setting the DSR factor to 2x, but it didn't help. I looked through the monitor's settings and didn't find many post-processing options, but I will note that moving the sharpness slider (which I would usually turn off) all the way up improved things a lot. It's still inferior to the native 1080p monitor, though.

So if anybody has any ideas or solutions, they would be appreciated.
The specific monitor is an LG 27MU58-B,
my GPU is a GTX 780,
and my Windows version is Windows 10.
 

marksavio

Estimable
Dec 23, 2017
im pretty sure windows supports hi-DPI resolutions (including 4K in your case), whether upscaling or downscaling. you can also go into your monitor's OSD menu to tweak sharpness/contrast/overdrive settings. but if you are referring to your installed software, that's a different issue. not all programs were developed with anything higher than 1080p in mind. you can try right-clicking on the app, go to Properties > Compatibility > override high DPI scaling, and set whichever mode suits you best.
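
if you end up doing that override for a bunch of apps, the same setting can be written programmatically. a minimal sketch in Python (the exe path is a made-up example, and "~ HIGHDPIAWARE" is my understanding of the value the Compatibility tab writes — double-check before relying on it):

import winreg

# hypothetical path to the program you want to override
exe_path = r"C:\Program Files\SomeOldApp\app.exe"

# per-user key that the Compatibility tab writes its flags to
key_path = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    # "~ HIGHDPIAWARE" should match "Override high DPI scaling behavior: Application"
    winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "~ HIGHDPIAWARE")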

i do not know of an app that will universally adjust DPI throughout all apps like that, and i dont think it would be very popular if there were one. much older software has already stopped development, so you will be stuck with its relatively low-DPI quality on your 4K.

if you are talking about movies/pictures, that's also a different issue. there are apps on the internet that run "advanced algorithms" to upscale lower-resolution pictures for you. as for movies, depending on the player you are using, there are scalers that do a good job of upscaling the video in real time.
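
for still images you can even compare resampling filters yourself with Pillow. a small sketch, assuming a file called photo.png and a 2x factor (both just examples):

from PIL import Image

img = Image.open("photo.png")
size = (img.width * 2, img.height * 2)

# nearest-neighbor: every source pixel becomes a 2x2 block, no new colors
blocky = img.resize(size, Image.NEAREST)

# bilinear: blends neighboring pixels, which reads as softness/blur
smooth = img.resize(size, Image.BILINEAR)

blocky.save("photo_nearest.png")
smooth.save("photo_bilinear.png")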
 

sm620

Honorable
Oct 18, 2012


I have some programs and some games that don't scale at all, so for those I prefer to just set the whole display to 1080p. The problem is that it ends up blurrier than a normal 1080p monitor. I was hoping there would be a simple way to treat everything as 1080p and do a straight upscale that doesn't degrade quality.

I understand that anything scaled by a non-whole-number multiple will degrade. I just thought that because it could use 4 pixels instead of 1, it could be lossless: 1920 × 2 = 3840 and 1080 × 2 = 2160, so 1920x1080 maps exactly onto 3840x2160. I don't get why so much blurriness is added when no complicated processing should be needed to scale the 1080p picture to 4K. My goal isn't to make the 1080p image look better because it's on a 4K monitor; I just want it to look the same as it would on a 1080p monitor.
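
The lossless part is easy to sanity-check in a few lines. A sketch using numpy on a made-up 1080p frame, just to show that duplicating each pixel into a 2x2 block is exactly reversible:

import numpy as np

# stand-in for a 1080p frame (height x width x RGB), random content
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

# integer 2x upscale: repeat every row and every column once
up = frame.repeat(2, axis=0).repeat(2, axis=1)

# sampling every other pixel recovers the original exactly
assert up.shape == (2160, 3840, 3)
assert np.array_equal(up[::2, ::2], frame)  # lossless round trip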

I think turning the sharpness setting all the way up on the monitor helped a lot, but I don't understand why it's so blurry without that processing on the monitor's side.
According to this: https://www.crutchfield.com/S-8G2F43RuYFv/learn/learningcenter/home/tv_signalquality.html
the sharpness setting is generally edge enhancement done on the monitor's end, and it isn't really designed for high-quality inputs.
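
That would explain the behavior: edge enhancement exaggerates transitions that already exist; it can't recover detail the scaler smeared away. Pillow's unsharp mask is a rough stand-in for what the slider might be doing (the screenshot filename and the radius/percent numbers are just guesses, not the monitor's actual processing):

from PIL import Image, ImageFilter

blurry = Image.open("upscaled_1080p.png")  # hypothetical screenshot

# boost contrast around edges; detail lost to the scaler stays lost
sharpened = blurry.filter(
    ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3)
)
sharpened.save("sharpness_boosted.png")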
 

sm620

Honorable
Oct 18, 2012
I found this while researching:
http://tanalin.com/en/articles/lossless-scaling/
Included in that is a link to this:
https://communities.intel.com/message/465784#465784

It seems like it hasn't been implemented for some reason. Unfortunately, the scaling methods Windows currently uses are just really bad; it gives me a headache to read the blurry text. It looks like there is a way to enable integer-ratio scaling in Linux, but it isn't implemented well in the driver, so it doesn't work right in all situations.
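
For reference, the Linux route that article describes goes through xrandr's transform filters. Something along these lines, assuming a reasonably recent X server and an output actually named HDMI-0 (yours will differ):

xrandr --output HDMI-0 --mode 1920x1080 --scale 2x2 --filter nearest

The --filter nearest part is what keeps the 2x2 pixel blocks intact instead of blending them.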

It feels like they tried to blur all the edges together to improve quality, but instead it makes everything look worse. My guess is that instead of adding a special case to the drivers for integer scaling ratios, they use one method for all scaling, whether the ratio is an integer or not. It's a problem because if my GTX 780 can't play a game on Ultra at 4K, I might prefer to go back to 1080p rather than sacrifice shadow quality or ambient occlusion. So there probably isn't a current solution for my problem; I hope this functionality comes to drivers/Windows soon.
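
You can see that one-size-fits-all behavior with a tiny test. This sketch 2x-upscales a checkerboard two ways; the generic smoothing filter invents gray in-between values even though the ratio is an exact integer:

import numpy as np
from PIL import Image

# 4x4 black/white checkerboard, the worst case for a smoothing scaler
board = ((np.indices((4, 4)).sum(axis=0) % 2) * 255).astype(np.uint8)
img = Image.fromarray(board)

nearest = np.array(img.resize((8, 8), Image.NEAREST))
bilinear = np.array(img.resize((8, 8), Image.BILINEAR))

print(sorted(set(nearest.flatten())))   # [0, 255] -- still pure black/white
print(sorted(set(bilinear.flatten())))  # in-between grays appear -> blur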
 

marksavio

Estimable
Dec 23, 2017
dont fret much about it, man. :) im pretty sure a lot of programmers, especially at microsoft, are aware of this monitor issue, especially when dealing with multi-monitor setups. and it's getting more obvious now with a wider range of resolution and refresh rate pairings across different screen sizes/DPIs (damn, those ultrawide ones look dope tho). yeah, sometimes text/icons appear kinda jagged. what i usually do is a "refresh" where i log out and back in to reload the monitor settings. who knows? next week, or a couple of months later, windows will release a patch or nvidia will come up with a solution.

like i mentioned before, for now you just have to tweak it in the monitor's OSD menu itself.

 

sm620

Honorable
Oct 18, 2012
Hopefully they'll solve it soon. It looks like people have been requesting integer scaling for a few years now, and nothing's been done about it. The quality loss is significant when the scaling method turns what should be crisp, jagged edges into blends of color. Instead of sharp pixels I see blurs, which are more distracting than the pixels would have been. The problem is most noticeable with pixel art, but fonts and icons are essentially pixel art too.

I ended up signing this change.org petition, but who knows if that will accomplish anything ( https://www.change.org/p/nvidia-amd-nvidia-we-need-integer-scaling-via-graphics-driver )