1080p vs 1440p vs 4K

Legolas8181

Distinguished
Nov 18, 2013
730
0
18,990
For everything from simple browsing to Call of Duty in 4K. What does everyone think of it?

And also, what would 1080p look like on a 4K display? Distorted at all?
 
I run a 55" 4K HDTV, so I can say with certainty there is no distortion; however, things like anti-aliasing don't work nearly as well unless you're at native resolution or very close. For me, AA starts to show up at 1440p or higher on a 4K native panel. If you're not huge into AA, then absolutely no worries. Since 4K is literally four 1080p panels (pixel-wise), game/video scaling is perfect with a 4-to-1 ratio (i.e., one 1080p pixel becomes four pixels at 4K).
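The 4-to-1 ratio above is easy to check with a quick sketch (using the standard 1920x1080 and 3840x2160 resolutions; the function name is just illustrative):

```python
# Standard resolutions for 1080p (FHD) and 4K UHD
FHD_W, FHD_H = 1920, 1080
UHD_W, UHD_H = 3840, 2160

# Each dimension exactly doubles, so every 1080p pixel maps to a 2x2 block at 4K
scale_x = UHD_W // FHD_W   # 2
scale_y = UHD_H // FHD_H   # 2
pixels_per_source_pixel = scale_x * scale_y

# Integer scaling: 4K pixel (x, y) samples 1080p pixel (x // 2, y // 2),
# so no interpolation (and hence no blur) is required
def source_pixel(x4k, y4k):
    return (x4k // scale_x, y4k // scale_y)

print(pixels_per_source_pixel)     # 4
print(source_pixel(3839, 2159))    # (1919, 1079) -- the last 1080p pixel
```

Because the ratio is a whole number in both axes, the mapping is exact; this is why a clean 1080p-to-4K scale is possible in principle, even though (as discussed below) many displays still apply a blurrier interpolating scaler.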
 
Running at non-native resolutions, such as 1080p on a 4K monitor, can result in blurring and distorted images; not recommended.
At the moment 4K is not a good option, because you need to invest a ludicrous amount of money into a rig in order to consistently hit 60 fps.
1440p 144 Hz is the best option if you have a single 1080 or 980 Ti/1070 SLI and want to get the most out of your PC, while 1080p 144 Hz suits essentially everyone with a 970/RX 480 and above. Hell, even a 960-class GPU will get some use out of it.
 


Only true in my experience when running AA... and then it's more that AA looks like it isn't really on than a distorted image. So I'm not sure I agree about 1080p on a 4K display.
 


TV resolution scaling is much better than what you'd experience with a monitor, because TVs frequently upscale 1080p content; most TV content is not native 4K at the moment. This is another reason good 4K TVs are so expensive: the scaling technology they implement is much better than in monitors, which are intended to be run exclusively at 4K for the most part.
 


As mentioned below (not sure if you posted before I did! 😛), resolution scaling is much different on TVs compared to monitors, not to mention the input lag you may experience on such a panel: its original intended use is viewing, not gaming, so it was never designed around low response times.
 
^My UHDTV doesn't scale the HDMI PC input on my particular set, since I run 4:4:4 chroma subsampling, but I know what you speak of. If I turn off 4:4:4 (which looks horrid, IMHO), then I can get upscaling to work on my set. I also got the set with the lowest input lag at the time of my purchase, which for the record has zero to do with image scaling (with or without upscaling), so I'm not exactly sure why you brought it up, unless to make it look like I don't know what I'm talking about in regards to panels in general, which I assure you I do. Now, not all panels are equal, I agree: some people running 1080p on 4K have zero issues while others do have them. Some of it comes down to the quality of the panel you're using when you're not at native resolution. The point is, if you do your homework you should find a 4K panel that does not distort the image, again assuming no use of AA.
 

Yup, cost comes into all of this as well! :)
I brought up input lag in regards to a TV panel compared to a 4K monitor panel: the TV would have significantly more input lag because it was not designed for input peripherals such as a keyboard and mouse, on which the lag is quite noticeable compared to, say, a TN 4K monitor with lower input lag, due to the nature of the respective panels and their intended purposes.

 


Yeah, sort of (on cost). I am on a UHDTV for 4K, but he is talking about a monitor, so it's more "not" on cost. That's not to say 4K on smaller panels is less expensive than 1080p at the same size; clearly that's not the case. It's just that cost wasn't mentioned in the post, image distortion was. Again, not to say cost doesn't matter, but I'm not sure it's a big focal point in this particular post (OP, please correct me if I'm wrong, as it changes the advice given).

As far as input lag, you are very correct, and that is why I got a set with such low input lag: 37 ms rated for 4:4:4 (in practice my set gets 30-34 ms), and if I'm willing to live with 4:2:2 or less I can get 21 ms rated (17-19 ms in the real world). Granted, a good gaming monitor at 1-2 ms (assuming a TN panel) kills my UHDTV, but my numbers are not bad for a 55" curved screen, considering. I am very sensitive to input lag, so I can see why you brought it up. But again, if he is going for a monitor and not a TV, it should generally be a non-issue even on a panel that isn't gaming-centric (say he gets a bad 10 ms panel; that still kills my setup).

For the record I run a Samsung UN55JU7500 in case OP wants to look up any of what I just said spec wise.
 
I find 4K pointless in monitors; they're too small. Some argue that you should just enable DPI scaling, but that's a lame excuse for a pointless product.

DPI scaling not only introduces GUI bugs in programs, it's also just trying to simulate a lower resolution. Yes, various programs will allow you to change the size of text, icons, and so on, and some even offer that customization natively.

4K gaming, on the other hand, is pretty sweet, but as long as text is involved and the game wasn't designed for that resolution on small displays, you run into a problem. You might as well buy a magnifying glass when buying a 4K monitor; you're going to need it.

Input lag on a TV isn't that bad. Even if 30 ms is noticeable to some people, using one with a mouse at a high refresh rate basically gets rid of that lag completely.

Now, as far as 1080p on a 4K display is concerned, there's no short answer. As mentioned above, there are multiple ways to scale, but currently no display in the world can show 1080p on a 4K panel without losing some quality from the signal. There's no distortion, however: you're not stretching the image more on the x-axis than the y-axis, so the aspect ratio remains the same.

Also, I wouldn't say the primary reason TVs are expensive is the scaling algorithm used; I'd say it's more about the panel specifications. Not all IPS panels are equal, and not all VA panels are equal. Not all TVs are 60 Hz; the majority of 4K TVs are currently 120 Hz natively, and some even let you use them with a PC to output 1080p at 120 Hz without frame skipping.

Basically, when you pay more for a TV, you know it (depending on how much you spend) has:

- More HDMI ports
- More formats supported (HDR10/Dolby Vision)
- Faster processing (smart TV)
- Higher contrast ratio (and black level) -- If VA
- Lower defect ratio (better uniformity, LED technique is full array, often local dimming as well)
- Lower input lag
- Higher refresh rate (faster response times and support for 24p playback etc.)

Just a few examples, but the same can't be said for a monitor. The second you factor in display size, you throw the argument that a monitor is better value than a TV out the window.



All the best!
 


Extremely well put.
 

"Input lag in a TV isn't that bad. Even if 30 ms is noticeable to some people, using them with a mouse at a high refresh rate basically gets rid of that lag completely."
-------------------------------------------------------------------------------------------------------------------------
Input lag is not countered by high refresh rates, full stop. If anything, you'll notice lag MORE at higher refresh rates, because the lag will look more prevalent as it is spread across more frames than it would be at 60 Hz, even though the input lag time remains the same.
Even though the effect is minimal, it is still a fact.
 


I took that as a high refresh rate for the mouse and keyboard. In fact, that's exactly what I use to help reduce input lag. Now, if he meant a high refresh rate in regards to the TV, then yes, you would be right. If he meant the refresh rate of the mouse/keyboard, Suzuki is dead on. Though to be fair, the terminology used can be confusing; I would have used "polling rate" in place of "refresh" for the mouse/keyboard, considering he was speaking of both panels and peripherals.
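For reference, a polling rate converts to a worst-case per-report delay like this (the rates below are the common USB mouse settings; the function name is just for illustration):

```python
# A mouse's polling rate (Hz) sets how often it reports its position;
# the worst-case delay it adds is one polling interval.
def polling_interval_ms(rate_hz):
    return 1000.0 / rate_hz

for rate in (125, 500, 1000):   # common USB mouse polling rates
    print(f"{rate} Hz -> up to {polling_interval_ms(rate)} ms added delay")
# 125 Hz  -> up to 8.0 ms
# 500 Hz  -> up to 2.0 ms
# 1000 Hz -> up to 1.0 ms
```

So bumping a mouse from 125 Hz to 1000 Hz shaves off at most about 7 ms, which is small next to a 30 ms display lag but is the part of the chain you can control for free.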
 
No, I meant the refresh rate of the mouse, also known as the polling rate.

I'm trying to improve my English; one day I'll make sense...

Now, if we're going to be that specific, then I'll gladly correct you and say that you confused input lag with response time. :)




All the best!
 


That was kind of awesome. Thank you for that!