News: LG's 38-Inch Nano-IPS 170 Hz Ultrawide May Be Landing on Shelves Soon

This looks like a great monitor with one critical exception... the contrast ratio is abysmal (but that isn't the monitor's fault, it's just how the current tech works).

At just 1000:1, we're still stuck at contrast ratios as low as some of the very first LCDs to come to market (my original 22" Dell from 2007-2008 had a contrast ratio of 1000:1).

Nano-IPS also has severe IPS glow and some color distortion.

If you care about perfect accuracy in a first-person shooter, then I guess this can work for you, but it's awfully expensive. There are better options out there that sacrifice general display quality for speed at a much lower cost.

For me, I really want a quality display all around. The whole point is to be looking at something pleasing, and that means it needs a solid contrast ratio. For the most part, only VA and of course OLED provide that right now (and both have their own problems, especially VA, depending on who manufactures the panel).

P.S. - This isn't a 'near 4K' display. It has the same pixel density as a typical 27" at 1440p, about 110 ppi. To get 'near 4K' you have to get closer to 150-170 ppi, and for an ultrawide that means a 5120x2160 panel.
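If anyone wants to check that density math, here's a minimal Python sketch; the exact panel specs are my assumptions (3840x1600 for the 38" model, which is typical for this class, and a 34" size for the 5120x2160 example):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal length in pixels divided by diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

# Assumed panel specs, for illustration only.
displays = [
    ('38" ultrawide, 3840x1600', 3840, 1600, 38.0),
    ('27" 1440p,     2560x1440', 2560, 1440, 27.0),
    ('27" 4K,        3840x2160', 3840, 2160, 27.0),
    ('34" 5K2K,      5120x2160', 5120, 2160, 34.0),
]

for name, w, h, diag in displays:
    print(f"{name}: {ppi(w, h, diag):.0f} ppi")
```

The first two land around 109 ppi, while the 27" 4K and the 34" 5K2K both come out around 163 ppi.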
 

When people say things like "near 4K", they're usually talking about pixel count, not pixel density. A 15" laptop with a 4K display has a much higher pixel density than a 4K TV, yet they're both still 4K, obviously. This has about 75% of the number of pixels that a 3840 x 2160 display does.
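A quick way to sanity-check that figure (3840x1600 is my assumption for this panel):

```python
# Rough pixel-count comparison; 3840x1600 is assumed for the 38" ultrawide.
ultrawide_px = 3840 * 1600   # ~6.1 million pixels
uhd_px       = 3840 * 2160   # ~8.3 million pixels

print(f"{ultrawide_px / uhd_px:.0%} of a 3840x2160 display's pixel count")  # -> 74%
```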
 

Yeah, they may be talking about resolution, but what they're really getting at is density, even if that's not what they're thinking. Referring to "4K" alone isn't very meaningful except in the situation you suggested, smaller displays. We need to focus on pixel density if "4K" is going to mean anything as a reference point for the displays we're actually talking about.

For example, if I had a 1000" TV with that same pixel density, 110 ppi or so, it would have WAY more pixels than a 4K screen by your reasoning, but of course we wouldn't say it's a "4K" screen, or 5K, or even 8K. The label just isn't accurate.
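To put a number on that thought experiment (the 16:9 shape is assumed):

```python
from math import hypot

# Hypothetical 1000" screen, assumed 16:9, at roughly the same ~110 ppi density.
diagonal_in = 1000.0
aspect_w, aspect_h = 16, 9
density_ppi = 110.0

scale = diagonal_in / hypot(aspect_w, aspect_h)   # inches per aspect-ratio unit
width_px  = round(aspect_w * scale * density_ppi)
height_px = round(aspect_h * scale * density_ppi)

print(f"{width_px} x {height_px} = "
      f"{width_px * height_px / 1e6:.0f} million pixels (vs ~8.3 million for 4K)")
```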

Another example: if we had a display 4000 pixels wide but just, say, 1 pixel tall, that would be a "4K" display too, or, further to your point about sheer pixel count, a display 11 million pixels wide. Of course this is ridiculous, but it points out the flaw in judging purely on a single part of the resolution.

This is partly why films and TV content are promoted by the vertical figure, 480p, 720p, 1080p, 2160p: if we're going to use resolution, the height is generally a more realistic quantifier of a display's detail level (even though it still depends massively on the overall size of the display and how far you are from it).

What we should be focusing on in every situation is pixel density (at least until the density becomes so great that your eyes literally can't tell the difference). Pixel density is the true reference for a display's detail quality. No one really cares how many pixels a display has, or even in what configuration, but how densely they're packed. When people say "4K", "UHD", or any other resolution shorthand, that's what they're ultimately trying to get at.
 
Yes, that 1000" TV would still be 4K/8K/whatever its resolution is. No, no one would call a 4000x1 display 4K. In the context of consumer displays (TVs and monitors), everyone understands "4K" to mean 3840 x 2160 pixels. That isn't technically the correct term, and we should be using "UHD", but it's become the common term regardless. When people say "4K" they mean 4K resolution, not some arbitrary PPI that you consider to be "4K".

The article used the term "near 4K" with reference to gaming performance. In that case pixel density is irrelevant; pixel count is what matters.

I don't know why you're trying so hard to conflate resolution and pixel density, but there is nothing wrong with discussing resolution in isolation, e.g. as this article does when mentioning gaming performance.
 

Agreed, if performance is what they're concerned about, then how many pixels there are is a valid point, but this is atypical with monitors. Monitors really only come in a handful of resolutions where performance matters: 1080p, 1440p, and 4K, basically (yes, there are more, but we don't need to get into those).

The sub-system (CPU/GPU/etc.) should be far more of a focus if you're worried about performance limitations. From that you can directly work out whether your '4K' or 'near-4K' monitor will be a wise purchase. If we want to suggest that the performance demands come close to those of a standard 4K monitor, then we really should be using something else and not stating 4K at all (total pixel count, as you suggest). That's where the confusion comes in: overlapping definitions used to get an entirely different point across.
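To illustrate that kind of derivation, here's a back-of-the-envelope sketch. It assumes a GPU-bound game whose frame rate scales roughly inversely with pixel count, which is a big simplification, and the baseline fps is a made-up example rather than measured data:

```python
# Rough rule of thumb, not a benchmark: when GPU-bound, frame rate tends to
# scale roughly inversely with pixel count. Baseline numbers are examples only.
baseline_px  = 2560 * 1440
baseline_fps = 100.0

for name, w, h in [("3440x1440", 3440, 1440),
                   ("3840x1600", 3840, 1600),
                   ("3840x2160 (4K)", 3840, 2160)]:
    est_fps = baseline_fps * baseline_px / (w * h)
    print(f"{name}: ~{est_fps:.0f} fps (estimated)")
```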

We have two considerations at the end of the day: pixel count and pixel density.

In case the main point got lost in these other concerns: "near 4K" doesn't fit a 1440p-class panel of any kind, even an ultrawide whose horizontal resolution matches 4K's width. An ultrawide of this caliber is still 'just' about 6.1 million pixels, while a standard 4K screen is about 8.3 million, roughly a third more.

I'm simply pointing out where clarity is needed so there's no confusion. These are important topics that often aren't well understood, and explaining the nuances can benefit buyers who may not know about them. Tech geeks and most journalists in the industry usually understand this (and still wrestle with what to say at the time of a post, so this is by no means a 'knock' on the writer; I'm again just clarifying).
 