Question: RX 570 connected to a Samsung Q60R through HDMI 2.0, HDR at 12 bit makes colors dull

TLTH

Hello. I wanted to test 4K (UHD) movies on this TV, and the one I got is also encoded in 10-bit HDR. The colors of the video were too dark, and I figured that perhaps Windows 10 doesn't interpret the colors correctly when left on the SDR setting.

After searching around I found out that HDMI 2.0 at 4K 60 Hz only supports 10-bit HDR when the pixel format (chroma subsampling) is set to 4:2:2. So I did that in the Radeon settings and enabled HDR in the Windows display settings, and now Radeon offers a color depth of either 8 bpc or 12 bpc. When it's left at 8 bpc the colors seem normal, but when it's set to 12 bpc the colors are washed out: blacks don't seem as dark and whites aren't as bright.
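As a sanity check, here is a rough back-of-the-envelope estimate of the raw data rate at 4K 60 Hz for the relevant formats, compared against HDMI 2.0's ~14.4 Gbit/s effective rate (18 Gbit/s link rate minus 8b/10b encoding overhead). It's a simplified sketch that ignores HDMI's exact signalling details (for instance, 4:2:2 is always carried in a 12-bit container), so treat the numbers as ballpark figures only:

```python
# Rough estimate of uncompressed 4K60 data rates vs. HDMI 2.0's effective limit.
H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 60   # CTA-861 4K60 total timing, blanking included
HDMI20_EFFECTIVE_GBPS = 14.4                 # 18 Gbit/s link rate after 8b/10b overhead

# Average samples per pixel: 4:4:4/RGB sends three full-resolution components,
# 4:2:2 halves the horizontal chroma resolution.
SAMPLES_PER_PIXEL = {"4:4:4 / RGB": 3.0, "4:2:2": 2.0}

for chroma, samples in SAMPLES_PER_PIXEL.items():
    for bpc in (8, 10, 12):
        gbps = H_TOTAL * V_TOTAL * REFRESH * samples * bpc / 1e9
        verdict = "fits" if gbps <= HDMI20_EFFECTIVE_GBPS else "exceeds HDMI 2.0"
        print(f"{chroma:>11} @ {bpc:2d} bpc: {gbps:5.2f} Gbit/s -> {verdict}")
```

That lines up with what I see: at 4:4:4 only 8 bpc fits, while 4:2:2 fits even at 12 bpc.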

Perhaps it has something to do with the television itself, as it switches its settings mode when it receives a 12 bpc signal. The backlight and contrast are automatically set to 50 (the maximum), but the color reproduction doesn't look right no matter how I adjust the TV settings.

I don't know whether it's an issue with the Windows settings, the graphics card, or the television. Why doesn't it allow 10 bits? Maybe something gets compressed? I also tried 4:2:0 at 12 bits, which requires less bandwidth than HDMI 2.0 allows, but it still looks dull.
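(The same rough arithmetic as above says 4:2:0 at 12 bpc should fit comfortably, which is why I expected it to work:)

```python
# 4:2:0 averages 1.5 samples per pixel (full-resolution Y, chroma shared by four pixels)
h_total, v_total, refresh, bpc = 4400, 2250, 60, 12
gbps = h_total * v_total * refresh * 1.5 * bpc / 1e9
print(f"4:2:0 @ 12 bpc: {gbps:.2f} Gbit/s (HDMI 2.0 effective limit ~14.4 Gbit/s)")
# -> roughly 10.7 Gbit/s, well under the limit
```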