Is HDR (DCI-P3) worth the extra $750+?

Darksider13

Distinguished
Feb 10, 2014
I have the opportunity to get a nice 4K display with a >120 Hz refresh rate, but said display only covers sRGB.

I'd really like a display that uses DCI-P3 (HDR) (I've been looking at the PG27UQ), but I see that not many exist, and the ones that do are way out of my price range. Is HDR worth saving the funds for so I can get the monitor I want later in the year, or is P3 really not that big a deal? The price difference between an HDR monitor and a 4K sRGB monitor averages around $700+.

Note: I have an iPhone 7 Plus which has DCI-P3 and tbh, the colors are good but they've never really blown me away.
 
Solution
Unless you do professional work with graphics, photo, or video and use programs which are color space-aware, it's usually best to stick to sRGB.

The problem is that nearly all web and video content is calibrated for sRGB. So when you view an image on a website, your computer and browser assume it's sRGB and display it as such. If your monitor is sRGB, then all is good and you're seeing the image as intended.

But if your monitor is DCI-P3 or AdobeRGB or something with a larger gamut, those sRGB colors get stretched to fit the new color space. If the image encodes red values of 0-255 in sRGB and a rose is 200 Red, it will display as 200 Red on the sRGB monitor. But if the DCI-P3 monitor covers 130% of the sRGB gamut, then the 200 Red in the sRGB pic will map to 200 Red in DCI-P3, and will display as the equivalent of roughly 260 Red in sRGB terms.

This is why the OLED screens on early Samsung phones made pictures look super-saturated. They were stretching sRGB pics to make them fit in the much larger color gamut of the OLED screens (about 160%-180% sRGB). If you've got a color space-aware app, it says "I need to display a sRGB pic on a DCI-P3 monitor, so that 200 Red in sRGB will map to 154 Red in DCI-P3", and it sends 154 Red to the monitor. And the red will be the same shade as it is on the sRGB monitor. Alas most programs don't know or care about color profiles, games in particular.
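To make the stretching concrete, here's a hypothetical toy sketch of the two behaviors described above. This is not how a real color pipeline works (actual color management converts via chromaticity coordinates and ICC profiles); the 1.30 ratio and the function names are made up purely to mirror the 200-vs-154 example:

```python
# Toy one-channel model: assume the DCI-P3 red range covers ~130% of
# sRGB's red range (the figure from the example above, not a real spec).

P3_VS_SRGB = 1.30  # assumed ratio of the P3 red range to the sRGB red range

def naive_passthrough(srgb_red: int) -> int:
    """A non-color-managed app sends the sRGB value to the panel unchanged,
    so the panel renders it on its wider P3 scale: the perceived color is
    the sRGB equivalent of a much higher value."""
    return round(srgb_red * P3_VS_SRGB)

def color_managed(srgb_red: int) -> int:
    """A color-space-aware app rescales the value so the displayed shade
    matches the original sRGB intent."""
    return round(srgb_red / P3_VS_SRGB)

print(naive_passthrough(200))  # 260: the oversaturated "stretched" red
print(color_managed(200))      # 154: the same shade as on an sRGB monitor
```

The second function is what a color-managed app effectively does when it "sends 154 Red to the monitor"; games and most other programs behave like the first.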

Most high-gamut monitors have a switch to put them into sRGB mode. That limits their color gamut to sRGB thus preventing the color stretching. But if you do that, you've paid for a high-gamut monitor but are only using the sRGB portion of its color capability. So it's a waste of money.

The only reason for a non-artist to get a high-gamut monitor is if you enjoy inaccurate but oversaturated colors. Hopefully one of these high-gamut color spaces will catch on and become the next standard. sRGB is terribly limiting because it was selected based on the poor color gamut of early LCD monitors and TVs (NTSC is nearly 150% of sRGB). But for now, most people are best off sticking with sRGB.

Darksider13

Distinguished
Feb 10, 2014
Thanks so much for your thorough response. I wasn't aware of the way DCI-P3 monitors translate colors made for sRGB. I'm sure this will be less of an issue in the future, once software can map the same image to different color profiles without needing two separate files. But for now, I'm sticking with sRGB.

Now, another inquiry I have is whether or not G-Sync is worth the price premium. I've done some research into G-Sync as a technology over the past hour or so, but it still isn't super clear to me whether I would benefit much from it. I have a five-year-old 1080p off-brand LCD TV/monitor, and with all the games I play (from older games like Sims 1 and Shadows of the Empire to newer games like Doom, Battlefield 1, and so on), I just turn V-Sync on if there's tearing; it eliminates the tearing, and that's about all I care about. I honestly don't notice any latency/lag with V-Sync on as opposed to off. The only time I might notice some V-Sync lag is in games like No Man's Sky, where I'll have really high frame rates in one region, but if I turn two degrees to the right my frame rate is cut in half and I feel a slight delay in input.

Any suggestions there?

Thanks again for your initial answer by the way!