Display, dot pitch, resolution: what to do?

U6b36ef

I bought a 22" (21.5") monitor and have had it over a year. I like it a lot, because it aces configuration tests and its colours are great for a budget monitor. However I often think a 24" monitor might be better.

My issue is all about dot pitch and aliasing/picture clarity. If I move up to a 24", the monitor will still be 1920x1080, like my 22". However, the pixel density will be lower and games run at 1080p will look less sharp.

In most cases I run games at 2560x1440 via DSR, scaled down to 1080p. Yes, I could still do this with a 24", but the games where I can't will look worse.

If I follow this on and look for a 2560x1440 native-res screen, I have to go to 27". That defeats the object, though, because the pixel density is lower than a 24" at 1440p would be. However, I can't seem to find 24" monitors that run at 1440p, which I think would be ideal.

21" is only occasionally small, and I think 24" would suit me better than 27". Bascically then I want 24", 1440p, maybe 120Hz refresh, and IPS with minimal light bleed.

Does anyone have any ideas or am I looking at this wrong?
 
Here's the thing about resolution and size: viewing distance is key. Also, the higher the contrast, the sharper and better the colors and the blacks and whites will look, and that is really the only thing high-resolution monitors have over regular 1080p sets. They're doing the same thing to TVs right now. They're improving the picture quality of 4K sets (and using store mode) to make the jump from 1080p to 4K a lot easier for people to notice, hoping they'll fool the majority of average consumers into thinking the higher resolution is what made the picture look better. That is absolutely not the case.

Nonetheless, a 2.5K display at 27" isn't that useful compared to a 1080p set of the same size if you're about 3 feet away, which is the norm for computer use. Some people prefer to sit a lot closer, but for the majority it's around 3 feet. Even so, 2.5K displays will look a lot sharper and have much better colors, because they simply stopped improving the "low"-resolution 1080p displays. IPS is definitely the way to go, or optionally, in my opinion, the best VA monitor out there in the consumer market: http://www.newegg.com/Product/Product.aspx?Item=N82E16824136126

 

That's easy. 22/24 = 0.917. So simply sit at 91.7% of the distance you normally sit from the 22" monitor, and that's what a 24" monitor will look like.

The linear dot pitch isn't what's important per se. It's the angular dot pitch that's important, and (if you do the trigonometry) that scales linearly with distance except at extreme angles. So sitting closer to the monitor will have the same effect (in terms of angular dot pitch size) as getting a bigger monitor.
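If you want to put numbers on that, here's a rough Python sketch of my own (a back-of-the-envelope, assuming 16:9 panels and a roughly 900 mm, i.e. ~3 ft, desk distance; none of this comes from a spec sheet):

import math

def pixel_pitch_mm(diagonal_in, horiz_px=1920, vert_px=1080):
    """Physical pixel pitch in mm for a 16:9 panel of the given diagonal."""
    diag_px = math.hypot(horiz_px, vert_px)
    return diagonal_in * 25.4 / diag_px

def angular_pitch_arcmin(diagonal_in, distance_mm, horiz_px=1920, vert_px=1080):
    """Angle (in arcminutes) that one pixel subtends at the given distance."""
    pitch = pixel_pitch_mm(diagonal_in, horiz_px, vert_px)
    return math.degrees(math.atan(pitch / distance_mm)) * 60

d = 900  # roughly the 3 ft desk distance
print(angular_pitch_arcmin(21.5, d))               # current 22"-class 1080p
print(angular_pitch_arcmin(24.0, d))               # 24" 1080p: bigger angle, coarser look
print(angular_pitch_arcmin(21.5, d * 21.5 / 24))   # sit ~90% as far away: matches the 24"

The last two lines print the same angular pitch, which is the whole point: sitting closer to the 22" reproduces what the 24" would look like from your normal position.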

In most cases I run games at 2560x1440 via DSR, scaled down to 1080p. Yes, I could still do this with a 24", but the games where I can't will look worse.

If I follow this on and look for a 2560x1440 native-res screen, I have to go to 27". That defeats the object, though, because the pixel density is lower than a 24" at 1440p would be. However, I can't seem to find 24" monitors that run at 1440p, which I think would be ideal.
DSR is basically anti-aliasing. You're rendering the image at larger-than-screen resolution, then downsizing it with a resampling algorithm, and that resampling smooths out jagged lines. The only difference is that traditional anti-aliasing works at integer pixel scales (2x, 4x, 8x, 16x), or half-integer for the AA algorithms that draw at native resolution and pixel-shift for smoothing. Resampling, on the other hand, works at any ratio, at the cost of being more mathematically intensive.
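Just to illustrate what that downsizing step amounts to, here's a minimal Python sketch using NumPy and Pillow. The diagonal-edge "frame" is synthetic and purely for illustration; this is the general idea of rendering big and filtering down, not how any driver actually implements DSR:

import numpy as np
from PIL import Image

# Fake a "rendered" 2560x1440 frame: a hard-edged diagonal, the classic jaggy case.
h, w = 1440, 2560
yy, xx = np.mgrid[0:h, 0:w]
frame = ((xx * 0.6 > yy) * 255).astype(np.uint8)

# DSR-style step: resample down to the panel's native 1920x1080 with a smoothing
# (Lanczos) filter. The ratio here happens to be 4:3 per axis, but the same call
# works for any ratio, which is the point.
native = Image.fromarray(frame).resize((1920, 1080), Image.LANCZOS)

# The downsampled edge now contains intermediate grey values instead of a hard
# 0/255 jump, which is what smooths the stair-stepping.
print(np.unique(np.asarray(native))[:10])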

Anyhow, resampling and anti-aliasing are independent of angular dot pitch. If the dots are small enough that you can't really notice them, then any resampling or anti-aliasing will look good. But if you can easily spot the dots, then any resampling or anti-aliasing will look bad. Perhaps not as bad as no anti-aliasing at all, but there's no in-between here: either it's good or it's bad. When it's bad (dot pitch too large), you'll notice that the virtual pixel borders (at 2560x1440) don't match up with the monitor's pixels (at 1920x1080), and there's some wavering as the lines jump between the two. (This is tough to describe, but obvious after you've seen it a few times.)

(The jaggies that anti-aliasing attempts to reduce are caused by extraneous high-frequency information due to the pixels being square, instead of rounded blobs like back in the CRT days. The rounded blobs are actually a pretty good approximation of a Gaussian point spread function, which in turn is a good approximation of a "true" pixel. Pixels are really infinitely small dots, not squares; we just have to deal with square or rounded-blob pixels because it's really, really hard to push light through an infinitely small dot.

The square pixels on an LCD monitor are just preferred for computer work because the vast majority of computer interface elements use perfectly horizontal and vertical lines. The "extraneous" high-frequency information that square pixels introduce coincides with these horizontal and vertical lines, exaggerating their sharpness. You lose that exaggeration entirely when you rescale to a non-native resolution, which is why Windows 8.x compatibility rescaling makes apps look blurry even though they're being rendered at a higher resolution.)

Anyhow, have you tried running at the monitor's native resolution with anti-aliasing, instead of DSR? I can think of a few cases where DSR would be better (especially cases which cause moire). But for general use, you may be better off with a 2560x1440 27" monitor with antialiasing instead of DSR.
 
Yeah, with reference to distance from the monitor, I find the standard desktop distance perfect. However, my issue is that the 22" (21.5") just sometimes feels a little small. Therefore I want to upgrade in size but not downgrade in pixel density, so I would need a 1440p 24", and I might as well bump up the refresh rate while I'm at it.

@Suzuki, yeah, I had almost forgotten about VA. I think I'll stick with IPS though, because since getting one I couldn't buy TN ever again.

@Solandri Uhm, yeah, I have run games at 1080p with full AA. However, DSR looks better. It sometimes produces more detail, depending on how the scaling is handled by the game's developers. I have found the best-looking picture is 2560x1440 DSR at 1080p with FXAA running. Games run at 1080p with AA tend to look a little crispy round the edges after having used 1440p DSR at 1080p.

What it means is that a 1080p monitor has more detail and sharpness potential than rendering at 1080p actually uses. Since I am happy with 1440p on my 1080p monitor and want a little extra size, 24" seems the choice, but it should probably be 1440p. Otherwise, as said, games at 1080p on a 24" will look worse than on the 22".
 


I am not even sure what Solandri's post is trying to tell me.

I understand perfectly what DSR is, and thought that was obvious. I had already explained in my opening post why I thought going to 27" was pointless: it would require a higher resolution, since the pixel density drops because of the size, and therefore I would need to use 4K DSR.

Jaggies are not caused by pixel shape. They are caused by game coders not working down to individual-pixel definition, which is why AA or DSR cures jaggies. Otherwise jaggies would not go away with AA and it would be pointless.

This quote is the wrong way around: "That's easy. 22/24 = 0.917. So simply sit at 91.7% of the distance you normally sit from the 22" monitor".
It would be 24/22.

I know nothing of Windows 8.x compatibility rescaling, but I don't see the relevance, because I understand DSR.


The question is: what do we do when we want more native resolution without sacrificing pixel density? I have since found a 25" monitor which is tempting. The crux of the matter is why there are no higher-resolution monitors at 22"-24", hence the question. If there were higher-res monitors, games that don't take DSR well would look better at native resolution.

The principle is obvious. My friend has a laptop with a 17.6" 1080p screen. It has a much higher pixel density than my 22" monitor, and mostly needs no AA.

Or to put it another way: 1440p on a 22" monitor will look the same as 1440p DSR on a 1080p monitor. Fewer jaggies, and potential for more detail. Why then would I go up to 27" and not want a higher resolution than 1440p?
 

I'm not going to spend too much time on this because it's tangential to your original question. But yes, jaggies are (partly) caused by the pixel shape. As I said, the conceptual "pixel" is a rounded blob (meant to represent an infinitely small dot). CRT pixels were rounded blobs, which is why anti-aliasing wasn't as important on CRTs.

When you use square LCD pixels instead, the shape of the pixels introduces high-frequency noise. A sharp, square edge in Fourier space (frequency space) is represented by an infinite series of ever-higher frequencies. This is what I meant by square pixels adding high-frequency noise.
https://en.wikipedia.org/wiki/Gibbs_phenomenon

In the case of perfectly vertical and horizontal lines like in most computer UIs, this noise is coincident with the graphical elements, so it becomes invisible or makes the UI elements look even sharper than they conceptually are. But in diagonal lines and photorealistic scenes, it creates jaggies which aren't present in the virtual scene (the scene as it would appear if the pixels were infinitely small dots or rounded blobs).

Removing these jaggies involves running the virtual scene through a low-pass filter to strip out those artificial high-frequency elements: basically, mixing adjacent pixel values together to blur the image. Anti-aliasing does this at the pixel level, which is why it's limited to integer ratios. Resampling does it at an arbitrary ratio, at the cost of being more mathematically intensive. The blurred result effectively (or sometimes not so effectively) hides the jagged square pixel edges by removing that extraneous high-frequency information. The final image is blurrier than your 1080p or 1440p monitor is capable of delivering, but looks more pleasing because the extraneous high-frequency noise has been reduced or hidden. It's a trade-off between sharpness and noise.

(The second cause of jaggies is the pixels being arranged in a grid, instead of being allowed to have any arbitrary location. The effect of this is greatly reduced if you draw your pixels as rounded blobs. But it's exacerbated when you have square pixels.)
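A quick way to see the "hard edges carry extra high frequencies" point is to compare the spectrum of a hard step with a slightly blurred one. A small NumPy sketch of my own (the signal length and kernel width are arbitrary, picked just for illustration):

import numpy as np

# One scanline crossing a hard, square-pixel style edge...
n = 256
hard = np.zeros(n)
hard[n // 2:] = 1.0

# ...and the same edge after a small box blur (the low-pass step AA performs).
soft = np.convolve(hard, np.ones(5) / 5, mode="same")

def high_freq_share(x):
    """Fraction of the spectrum's total magnitude in the top half of frequencies."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    return spectrum[len(spectrum) // 2:].sum() / spectrum.sum()

print(f"hard edge:    {high_freq_share(hard):.3f}")  # a noticeable chunk sits up high
print(f"blurred edge: {high_freq_share(soft):.3f}")  # most of that is filtered away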

Anyhow, I'm not trying to dissuade you as to whether DSR or AA looks better. Everyone's visual perception is slightly different, and little patterned flaws that I may not notice with AA may really bother you but are masked or removed by DSR. If you say DSR looks better to you, then DSR looks better to you, and moving to a 1440p 27" monitor would probably be a mistake since you'd no longer get the benefit of DSR (unless you moved to a higher render resolution with lower framerate).

So like I said in my first post, simply position your head at 91.7% of your normal viewing distance from the 22" monitor, and that will be what a 24" monitor looks like at your normal viewing distance. Then you can decide if the pixel pitch is still small enough for your tastes at 1080p.
 


I see what you meant by that now.

Usually what people say is, get a bigger monitor and set it further back. I figured you were presenting that idea compared to a smaller monitor closer up.

1080p with full AA does look very tidy though, as many of us will attest. I have played games like Painkiller Black Edition with AA maxed out on this 22" monitor and it looked great. Better than it would ever have looked years ago on older-tech monitors, when it would have made GPUs struggle.

I have no issue with pixel shape. When you run a game at 1440p DSR with AA on, it looks very good. However, a more densely packed monitor with a tighter dot pitch would be even better. If there were a 22" in 1440p, it would make a better picture than a 1080p monitor pulling 1440p DSR.

Since 24" is not that far up in size, that's better dot pitch than 27" to me. If only they made smaller higher res monitors. It's infuriating when you consider some mobile phones and tablets have higher than 1080p screens.
 


At what point would higher resolution become useless for PC monitors? I think 24" 4K is fine. They have already released 8K.
 
Again, a monitor is a little bit bigger than your average phone display. However, the concept of diminishing returns is very real. Once you are a certain distance away from a display, you can no longer make out the pixel structure, and past that point a higher resolution won't make things sharper; it can't make things sharper. It's based on the physics of human perception and our average visual acuity of 20/20 (6/6). We're all different though; that's just the average.
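For anyone curious, here's the rough arithmetic behind that in Python, assuming the usual ~1 arcminute figure for 20/20 acuity and 16:9 panels; treat the numbers as ballpark, not exact science:

import math

def resolve_distance_m(diagonal_in, horiz_px, vert_px, arcmin=1.0):
    """Rough distance beyond which ~20/20 vision (about 1 arcminute) can no
    longer separate adjacent pixels on a 16:9 panel."""
    pitch_m = diagonal_in * 0.0254 / math.hypot(horiz_px, vert_px)
    return pitch_m / math.tan(math.radians(arcmin / 60))

for name, diag, w, h in [('24" 1080p', 24, 1920, 1080),
                         ('24" 4K',    24, 3840, 2160),
                         ('24" 8K',    24, 7680, 4320)]:
    print(f"{name}: pixels blend together beyond roughly {resolve_distance_m(diag, w, h):.2f} m")

By this estimate a 24" 4K panel already stops resolving individual pixels at around half a metre, so at a typical 3 ft desk distance anything beyond that resolution buys very little.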
 
Something I noticed when playing about with DSR is that 1440p plus FXAA generally gives the best-looking picture at 1080p.

I found 4K scaled down to 1080p on the 22" to be overkill, depending on the game. It brought too much detail into the picture and became a sensory overload. Either that or I was just not used to it.

An example is Serious Sam 3, which becomes a little too soft and could be described as blurry looking. Since 4K is meant to be displayed on a larger screen, there has to be more detail in the 4K image, and at 22" it was just too much detail sometimes. It might work better on a native 4K 24" though.

Anyway, all that was by the by; I was just mentioning it for anyone who is interested. Some games look better scaled from 4K, Aliens: Colonial Marines being one of them.
 
To be honest, I don't think we're going to see high-res displays at these sizes; they'll save those for the slightly bigger displays. We can only hope, but I seriously doubt we're going to see it anytime soon, if ever. Aliasing is a real problem, and a higher resolution or DSR reduces it. To be clear, I'm not arguing over that; I just don't see the majority of consumers wanting something like it, which is really what it comes down to: they need a market. But who knows. Major brands are currently pushing out 4K TVs and have stopped improving the picture quality of 1080p displays to make the jump a lot more obvious.
 


Yeah, it's a real surprise that manufacturers have not made higher-resolution displays the norm. If all 24" monitors were either 1440p or 1080p, people would be happy with the choice. As it is now, it's 1080p and that's it, whereas we all know a higher resolution means better-looking games.

I have found that Dell have released a 24" monitor at 1440p: http://www.tomshardware.com/news/dell-p2416d-qhd-monitor,28890.html

The Dell P2416D. Tom's Hardware say, at last, it should have been here three years ago, and I could not agree more. I was astonished when I was looking for my last monitor and found 1080p was the max for 22"-24", and 18.4" monitors maxed out at 1600x900. I could not believe it. I thought manufacturers would have jumped at gamers wanting high-resolution displays. My theory is that manufacturers push consumers towards bigger screens to get higher resolution, because they are more expensive. However, a bigger screen at a higher resolution is not better pixel density, so it's a red herring.

Anyway, I was playing Metro: Last Light tonight on my 22" 1080p monitor. I use DSR with it at 1440p, with no AA. In a sense I get the best of both worlds, because it looks brilliant: high resolution on a small monitor. Soon there will be some games where I will not be able to run 1440p because the GPU will not be powerful enough, e.g. The Witcher 3, but I can revert back to 1080p when needed.

There is also the Dell U2515H, which by all accounts is a good monitor. 25" at 1440p is not a bad compromise. I seriously doubt I would ever want a 27", and I would not buy a 27" just to get 1440p, not when a 22" 1080p gives similar pixel density.
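For reference, here's a quick pixel-density comparison of the sizes being thrown around in this thread (my own sketch, using nominal diagonals and assuming 16:9, so treat the numbers as approximate):

import math

def ppi(diagonal_in, horiz_px, vert_px):
    """Pixels per inch for a panel of the given diagonal and resolution."""
    return math.hypot(horiz_px, vert_px) / diagonal_in

candidates = [
    ('21.5" 1080p (current monitor)', 21.5, 1920, 1080),
    ('24" 1080p',                     24.0, 1920, 1080),
    ('24" 1440p (e.g. Dell P2416D)',  24.0, 2560, 1440),
    ('25" 1440p (e.g. Dell U2515H)',  25.0, 2560, 1440),
    ('27" 1440p',                     27.0, 2560, 1440),
]
for name, diag, w, h in candidates:
    print(f"{name}: {ppi(diag, w, h):.0f} PPI")

The 27" 1440p lands only a little above the 21.5" 1080p, while the 24"/25" 1440p options are a real step up in density, which is the whole argument here.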

 
If they were to push 1440p as the new mainstream resolution, how much would you have to spend to comfortably run a game at that resolution, with games constantly becoming more and more demanding? There's no room for an improvement that big. The TV market, however, is nearly done with 1080p for good; they've stopped improving 1080p and are planning to go for 4K HDR with the new HDMI 2.0a this Christmas for the new Blu-ray players.

Services like Netflix's 4K streaming don't even beat 1080p Blu-ray. I personally can't wait until 4K is mainstream, but with all the visual-acuity science they're likely going to stick to bigger sizes. So is it really better, apart from the picture quality gains from improvements to everything else except resolution? That is still a massive jump in terms of picture quality.

PC, however: same old, same old. While it's true that DSR helps reduce visible aliasing in games, keep in mind that in order to see the increase in DETAIL, a bigger screen is required. It's a very difficult question to answer, but I honestly think 1080p will remain the PC "standard" resolution for a very long time. It's clear the focus is more on software and new hardware such as G-Sync/FreeSync than on things that actually help improve picture quality and aren't cash grabs. :)
 
I should mention that what you said about needing a bigger screen to see DSR improvements isn't true. At 1080p, with 1440p DSR, in many games the improvement is in details which you can see. Edges of stuff look less crispy and better defined. 1080p game resolutions can leave the image looking a little unfinished, whereas going to 1440p finishes it off. 4K looks a little too soft sometimes.

If I had not seen 1440p though I would not have known how much better it could look at 1080p.

As for what you said about 1440p not being mainstream because people would not have the hardware: well, people buy 27" 1440p monitors, so they do have the hardware, whether that's SLI or a top-end GPU. We should be able to choose what resolution we run, not be forced to buy a huge screen with no pixel-density improvement over smaller, lower-res screens.

People are starting to think of 4K as the near-future norm. 1440p is almost the happy medium, and 1080p is the most commonly used resolution. New screens will move 1440p forward, and I agree with Tom's Hardware: it is despicable that they have not done it before now. A tighter dot pitch produces richer colours and overall better images, especially when zoomed in.

It's agreeable though that 1080p might hang on for a while, since we can use DSR. I am thinking seriously about that, rather than buying again. I might find the Dell 1440p 25" a bit big. All things to think about.

Maxwell has proved we can do more for less power and less heat. I say bring it on. The next-gen cards will likely be a similar architecture on a smaller fabrication process, i.e. more transistors at the same power and heat, maybe with faster RAM. Add any future design boosts like Maxwell was and GPUs will be amazing.

I think you're right about G-Sync. It's scary though when you start to think about 120Hz or 144Hz at higher resolutions; the GPU power to run that is a lot. I'd love to see 120Hz, to know if it's something I couldn't live without. I am really happy, though, with 60Hz and adaptive V-sync.
 
Aliasing is not detail; aliasing is an artifact. DSR reduces aliasing, which is an artifact, and you see more detail because it's trying to mimic the detail level you get from a 1440p resolution. I should have worded that better. 1440p and 4K, however, are definitely not the future for PC gaming. Game textures aren't even that detailed to begin with if we compare them to real video/film; the game engines are limiting the detail big time.

The market for 1440p gaming is probably 1% of all gamers. This is a somewhat accurate number if we take a look at Steam's hardware survey: 2560x1440 is currently used by 1.17%, and 3840x2160 by only 0.07%, while 1920x1080 is at 34.51% by contrast. PC gaming will definitely not jump on the 4K train just because the TV manufacturers are leaving 1080p behind. 1080p is here to stay; there's a huge market, especially now that DSR and VSR exist as well. 1440p gaming, especially with future titles and the requirement to upgrade more often than you would at 1080p, is just not worth it for your average consumer. In my opinion...
 
Yeah, where I said "At 1080p, with 1440p DSR, in many games the improvement is in details which you can see", I meant there is more detail.

I followed with "Edges of stuff looks less crisp and better defined." I mean just that, and it's not just down to the anti-aliasing effect being improved.

I still disagree: we don't need bigger screens to see the improvements of higher resolutions.


Yes, I know 1080p is the most common; I said so. However, many of us are now using DSR, so 75% of the time we are gaming at 1440p. 1440p is taking over.
 
Something that baffles me about monitors is why we have jumped from 120Hz to 144Hz. I know more is better in a sense, but surely 120Hz is already very good. Pushing up to 144Hz just means more hardware resources needed.

Overall though, as we have all been saying, 120Hz along with a higher resolution is going to take some serious graphics power. Personally I would prefer to go with 120Hz over 144Hz; I really don't feel the need for just bigger figures. I read recently that even overclocking 60Hz monitors to 75Hz makes a difference. (My monitor doesn't give me the option.)
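Just to put rough numbers on the "serious graphics power" point, here's the raw arithmetic of frame-time budget and pixel throughput (my own sketch; it ignores everything a real GPU actually does per pixel, so it's only a lower bound on how the load scales):

resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, pixels in resolutions.items():
    for hz in (60, 75, 120, 144):
        budget_ms = 1000 / hz           # time available to render each frame
        mpix_s = pixels * hz / 1e6      # raw pixels to fill per second
        print(f"{name} @ {hz}Hz: {budget_ms:5.2f} ms per frame, {mpix_s:6.0f} Mpix/s")

Going from 1080p/60 to 1440p/120 roughly quadruples the pixels per second while cutting the per-frame budget in half, which is why the GPU requirement climbs so quickly.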