[SOLVED] Why does HDR in Windows 10 have washed out colors?

Pururaj

I tried a lot of tweaks, but HDR in Windows just seems broken; it makes colors look washed out and games look much worse. I've also noticed that Ubisoft games (Far Cry 5 and The Division 2) have a much better HDR implementation than the other games I've tested in HDR, like Anthem, RDR2, and Cyberpunk.

To make HDR in Windows somewhat acceptable, I tried bumping up the digital vibrance in the Nvidia Control Panel, but SDR still looks much better. Is anyone else having problems with Windows HDR, or is it just me? I have also tried HDR on my 4K TV, and it looks better there than on my PC monitor, which is 1080p. So what am I missing here?
 
Solution
I don't use an HDR display, but from what I've heard, enabling HDR for the Windows desktop doesn't tend to work too well. A high-dynamic-range color space is wider than a standard-range one, and for the vast majority of applications that are not HDR-aware and don't explicitly output a wider range of colors and brightness levels, Windows apparently just maps their output to a limited portion of what the screen is capable of. And if the monitor itself isn't physically capable of rendering a much wider range than a typical monitor, you end up with significantly worse color quality than on a non-HDR display in any applications or content not specifically designed for HDR.
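To put a rough number on "a limited portion": here's a small Python sketch (purely illustrative, not Windows' actual compositor logic) that computes where a few brightness levels land on the PQ (SMPTE ST 2084) signal curve used for HDR10. The ~200-nit figure for SDR white is an assumption; the real value depends on where the SDR brightness slider in Windows' HDR settings is set.

```python
# Illustrative only: where a few luminance levels sit on the PQ (SMPTE ST 2084)
# signal curve used by HDR10. Not the actual Windows compositor code.

def pq_encode(nits: float) -> float:
    """Return the PQ signal level (0..1) for an absolute luminance in nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

levels = [
    ("SDR white (~200 nits, assumed slider setting)", 200),
    ("DisplayHDR 400 peak", 400),
    ("HDR10 content peak (1000 nits)", 1000),
]
for label, nits in levels:
    print(f"{label:46s} -> PQ signal {pq_encode(nits):.3f}")
```

Running that puts SDR white around 0.58 on the signal scale while a 1000-nit highlight sits around 0.75, so desktop content really does occupy only a slice of the range the display is being driven with.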

So, for most HDR screens and use cases, it's generally best to keep the HDR setting in Windows turned off. I believe it's better to activate HDR in your graphics card's control panel instead, if the option is available, as that should still allow HDR to activate in games that support it, without forcing all non-HDR content to be mapped to a limited portion of your monitor's range. Though as you've noticed, not all HDR implementations in today's games are created equal, and some may not handle the feature as well as others.

And that applies even more so to monitors. A lot of screens classified as "HDR 400" (VESA DisplayHDR 400) don't really provide a much wider range of brightness levels than a typical non-HDR display, so content designed to be mapped to a screen capable of displaying 1000+ nits of peak brightness is unlikely to look its best on a screen that tops out at around 400 nits.
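For what it's worth, here's a minimal sketch of the kind of highlight compression involved, assuming a simple knee-plus-rolloff curve (actual monitors and games each use their own tone mapping, so treat the numbers as illustrative only):

```python
# Illustrative tone-mapping sketch: squeezing content graded for a 1000-nit peak
# onto a ~400-nit panel. Real displays/games use their own curves; this simple
# knee + roll-off just shows why bright detail flattens out on weaker HDR screens.

def tone_map(nits: float, display_peak: float = 400.0, knee: float = 0.75) -> float:
    """Map a scene luminance in nits into the [0, display_peak] range."""
    knee_nits = knee * display_peak      # below this point, pass through 1:1
    if nits <= knee_nits:
        return nits
    # compress everything above the knee into the remaining headroom
    headroom = display_peak - knee_nits
    excess = nits - knee_nits
    return knee_nits + headroom * excess / (excess + headroom)

for scene_nits in (100, 300, 600, 1000):
    print(f"{scene_nits:5d} nits in the content -> {tone_map(scene_nits):6.1f} nits on a 400-nit panel")
```

A 1000-nit highlight ends up at roughly 388 nits here, barely brighter than a 600-nit one at 375, which is essentially why bright scenes look flat and washed out on lower-tier HDR panels.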