Question: HDR turns itself off immediately after being toggled on (Windows 10)?

Mar 26, 2024
Hello everyone!
For some background context, I've been troubleshooting and trying to get HDR working for two months now... I've bought a new TV, two new PCs, a new GPU, and new cables. I'm slowly getting ready to rip my hair out in frustration...

Latest status: I ordered brand-new HDMI 2.1 cables to be 100% sure the cables aren't at fault.
After noticing that nothing happened on Ubuntu, I switched back to Windows for the third time, but for the first time on this new TV. Windows 10 automatically installed GPU drivers, and after a short screen refresh I finally had HDR10!
A notification from Windows said it had automatically detected the display and turned HDR on... radiating with happiness, I went to get my HDD with HDR content to finally test it.
When I came back 15 minutes later, HDR was off, and I could no longer toggle it on in the settings. It shows as supported, but as soon as I toggle it, the screen flashes dark and HDR turns itself off after less than a second.

I suspected Windows updates had caused this, so I reinstalled Windows again, but this time the first thing I did was disable and block ALL Windows updates. HDR worked again... until I went into settings and changed the resolution from FHD to 4K, at which point HDR immediately turned itself off again. It couldn't be turned back on, same as before.

What is causing this!?
Could someone please help find the source of this problem?
I've been working on my home cinema project for over a year now, and I find it ridiculous that I have to troubleshoot a basic feature of a high-end TV for two months… I would really like to finally finish this project.

TECHNICAL DETAILS / specifications:

  • PC:
Current GPU: AMD RX 6400 4GB HDMI 2.1 ITX (previous: GTX 1050 Ti ITX)
CPU: Intel Core i5-6500
Mobo: Asus B150I Pro Gaming/Aura
RAM: 16GB DDR4-2133
PSU: be quiet! Power 9 500W
SSD: Crucial MX300 500GB SATA 2.5"
HDD: WD My Book 8TB (WD Red)
OS: Windows 10 Pro


  • AVR:
Onkyo TX-SR393 5.2
HDR / HDR10 / Dolby Vision / HLG
ARC


  • current TV:
Hisense E7KQ 75" QLED
2023
HDMI eARC
HDR / HDR10 / HDR10+ / Dolby Vision / HLG


  • previous TV:
Samsung UE58NU7179 LCD
2018
HDMI ARC
HDR / HDR10 / HDR10+ / HLG


In TV settings, HDMI mode: "Enhanced"
Firmware: up to date
(otherwise the TV apparently can't achieve HDR)

I've heard HDR auto-detection could be causing this,
or some proprietary settings and formats from the TV manufacturer, because "why not" :)...
And maybe also some Windows drivers that are being installed automatically?

All help appreciated and thank you in advance!
 
Mar 26, 2024
UPDATE:
The exact same symptoms appear on Windows 11:
HDR10 works perfectly until I switch from FHD to 4K;
then HDR turns itself off and can't be toggled back on, even though it shows as supported. The screen flashes and it immediately goes back to disabled...

This leads me to believe there's maybe some kind of bandwidth limit??
BTW, fun fact: I played several videos in Windows Media Player while HDR was working, and all the FHD videos had audio but none of the 4K ones did... no matter whether they were HDR or SDR... which further supports this suspicion...
 
Mar 26, 2024
It's not clear; is the computer first connected to the AVR and the AVR then connected to the TV, or is the computer directly connected to the TV? Does this happen if the AVR is totally disconnected from the TV?
Yes, as of now the PC is an input to the AVR and ARC outputs to the TV...
I hadn't tested the PC hooked up directly to the TV yet because the AVR is a crucial part of the setup, but I will test it so that I can exclude it!
 
Mar 26, 2024
It's not clear; is the computer first connected to the AVR and the AVR then connected to the TV, or is the computer directly connected to the TV? Does this happen if the AVR is totally disconnected from the TV?
OH wow... yep... HDR works perfectly with 4K when the PC is hooked up directly to the TV... so how come it isn't when hooked up through the AVR?... The AVR's official product manual states that it "supports 4K/60p and HDR". Am I getting something wrong?
 
OH wow... yep... HDR works perfectly with 4K when the PC is hooked up directly to the TV... so how come it isn't when hooked up through the AVR?... The AVR's official product manual states that it "supports 4K/60p and HDR". Am I getting something wrong?
Maybe they only tested HDR compatibility with Blu-ray player outputs. So now, if you connect the TV's HDMI #3 eARC output to the receiver's HDMI input, does everything work correctly?
 
OH wow... yep... HDR works perfectly with 4K when the PC is hooked up directly to the TV... so how come it isn't when hooked up through the AVR?... The AVR's official product manual states that it "supports 4K/60p and HDR". Am I getting something wrong?
There's your problem. The AVR only supports HDMI 2.0, and you are running into bandwidth limitations.

Yes, HDMI 2.0 *can* do 4K60 HDR... at 4:2:0 chroma:

https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_for_HDR10_video

So when you change your resolution to 4K at 4:4:4/RGB, the NVIDIA driver/Windows detects that you don't have the bandwidth for HDR and turns it off automatically. What you can do is set your chroma to 4:2:0 in the NVIDIA control panel; then Windows will let you re-enable HDR (as you are now within HDMI 2.0 bandwidth limitations).

That's why connecting directly to the TV works: you have HDMI 2.1 through the entire chain and have zero issues running 4K60 HDR @ 4:4:4/RGB.
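
For anyone who wants to sanity-check the numbers, here's a rough back-of-the-envelope sketch (my own simplified model: it counts active pixels only and ignores blanking intervals, so real link requirements are somewhat higher; the constants are the commonly cited effective data rates after encoding overhead, not the raw link rates):

Code:
# Rough HDMI bandwidth check: does a given video mode fit the link?
# Active pixels only; blanking is ignored, so real requirements are a bit higher.

HDMI_2_0_GBPS = 14.4   # effective data rate (18 Gbit/s raw, 8b/10b encoding)
HDMI_2_1_GBPS = 42.6   # effective data rate (48 Gbit/s raw, 16b/18b encoding)

def required_gbps(width, height, refresh_hz, bits_per_channel, chroma):
    # Chroma subsampling reduces the average channel count per pixel:
    # 4:4:4 keeps all three; 4:2:0 keeps luma plus quarter-res chroma (1.5 total).
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

for chroma in ("4:4:4", "4:2:0"):
    need = required_gbps(3840, 2160, 60, 10, chroma)
    verdict = "fits" if need <= HDMI_2_0_GBPS else "exceeds"
    print(f"4K60 10-bit {chroma}: {need:.1f} Gbit/s -> {verdict} HDMI 2.0 (~{HDMI_2_0_GBPS} Gbit/s)")
print(f"(HDMI 2.1's ~{HDMI_2_1_GBPS} Gbit/s is why the direct connection has headroom to spare)")

4K60 10-bit 4:4:4 comes out to roughly 14.9 Gbit/s, just over what HDMI 2.0 can carry even before blanking overhead, while 4:2:0 halves the chroma data and drops it to about 7.5 Gbit/s, comfortably inside the limit.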
 
Mar 26, 2024
Maybe they only tested HDR compatibility with Blu-ray player outputs. So now, if you connect the TV's HDMI #3 eARC output to the receiver's HDMI input, does everything work correctly?
Yeah, I didn't know "HDMI splitting" was a thing... Yep, it works! I now have 4K 30Hz with HDR10 and audio over the HDMI eARC... it seems a bit more scuffed audio- and video-wise... but it should suffice for now...
Thank you for the suggestion!
 
Mar 26, 2024
There's your problem. The AVR only supports HDMI 2.0, and you are running into bandwidth limitations.

Yes, HDMI 2.0 *can* do 4K60 HDR... at 4:2:0 chroma:

https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_for_HDR10_video

So when you change your resolution to 4K at 4:4:4/RGB, the NVIDIA driver/Windows detects that you don't have the bandwidth for HDR and turns it off automatically. What you can do is set your chroma to 4:2:0 in the NVIDIA control panel; then Windows will let you re-enable HDR (as you are now within HDMI 2.0 bandwidth limitations).

That's why connecting directly to the TV works: you have HDMI 2.1 through the entire chain and have zero issues running 4K60 HDR @ 4:4:4/RGB.
Wow! OK... yeah, I hadn't thought about the RGB chroma... F... but since I have an AMD GPU, is there an alternative to the NVIDIA control panel?
And thank you for the detailed explanation!! :D