Hello everyone!
For some background context: I've been troubleshooting and trying to get HDR working for two months now. I've bought a new TV, two new PCs, a new GPU, and new cables, and I'm slowly getting ready to rip my hair out in frustration...
Latest status: I've ordered brand-new HDMI 2.1 cables to be 100% sure that the cables aren't at fault.
After noticing that nothing happened on Ubuntu, I switched back to Windows for the third time, but for the first time on this new TV. Windows 10 automatically installed GPU drivers, and after a short screen refresh I finally had HDR10!
A notification from Windows said it had automatically detected the display and therefore turned HDR on. Radiating with happiness, I went to get my HDD with HDR content to finally test it.
When I came back 15 minutes later, HDR was off, and I could no longer toggle it on in the settings. It still shows as supported, but as soon as I toggle it the screen flashes dark and HDR turns itself off in less than a second.
I suspected that Windows updates had caused this, so I reinstalled Windows again, but this time the first thing I did was disable and block ALL Windows updates. HDR worked again... until I went into the settings and changed the resolution from FHD to 4K, at which point HDR immediately turned itself off. It couldn't be turned back on, same as before.
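One thing I still want to rule out, since HDR dies exactly when I switch from FHD to 4K, is whether a 4K60 HDR signal even fits through an HDMI 2.0-class link (my AVR only lists ARC, so I'm assuming it isn't an HDMI 2.1 device). The snippet below is just my own back-of-the-envelope sketch, assuming the standard 594 MHz 4K60 timing and HDMI 2.0's 600 MHz TMDS limit, not a measurement of my setup:

```python
# Rough sanity check (my assumptions, not a measurement): does a 4K60 mode fit
# inside HDMI 2.0's 600 MHz TMDS character rate (18 Gbit/s across 3 lanes)?
HDMI20_MAX_TMDS_MHZ = 600  # 6 Gbit/s per lane x 3 lanes, 8b/10b encoded

def tmds_clock_mhz(pixel_clock_mhz: float, bits_per_component: int, subsampling: str) -> float:
    """Approximate TMDS character rate for a given pixel format."""
    if subsampling == "4:2:0":
        # 4:2:0 halves the effective pixel clock; deep color still scales it up.
        return pixel_clock_mhz / 2 * (bits_per_component / 8)
    if subsampling == "4:2:2":
        # 4:2:2 is carried in 12-bit containers at the full pixel clock (up to 12-bit).
        return pixel_clock_mhz
    # RGB / 4:4:4: deep color scales the clock directly.
    return pixel_clock_mhz * (bits_per_component / 8)

for subsampling, bits in [("RGB", 8), ("RGB", 10), ("4:2:2", 10), ("4:2:0", 10)]:
    clk = tmds_clock_mhz(594.0, bits, subsampling)  # 594 MHz = standard 4K60 timing
    verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds"
    print(f"4K60 {subsampling} {bits}-bit -> {clk:.1f} MHz ({verdict} HDMI 2.0)")
```

If that's roughly right, 4K60 at 10-bit RGB wouldn't fit over an HDMI 2.0 link at all, which might be related to why HDR only survives at FHD for me unless the chain drops to 4:2:2 or 4:2:0.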
What is causing this!?
Could someone please help find the source of this problem?
I've been working on my home cinema project for over a year now, and I find it ridiculous that I have to troubleshoot a basic feature of a high-end TV for two months… I would really like to finally finish this project.
TECHNICAL DETAILS / specifications:
- PC:
  CPU: Intel Core i5-6500
  Mobo: ASUS B150I Pro Gaming/Aura
  RAM: 16 GB DDR4-2133
  PSU: be quiet! Power 9 500 W
  SSD: 500 GB Crucial MX300 SATA 2.5"
  HDD: WD My Book 8 TB (WD Red)
  OS: Windows 10 Pro
- AVR:
  HDMI ARC
  HDR / HDR10 / Dolby Vision / HLG
- Current TV:
  2023 model
  HDMI eARC
  HDR / HDR10 / HDR10+ / Dolby Vision / HLG
- Previous TV:
  2018 model
  HDMI ARC
  HDR / HDR10 / HDR10+ / HLG
In the TV settings, HDMI mode is set to "Enhanced", and the firmware is up to date (otherwise the TV apparently can't achieve HDR).
I've heard that HDR auto-detection could be causing this, or maybe some proprietary settings and formats from the TV manufacturer (because "why not"...), and perhaps also some Windows drivers that are being installed automatically?
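In case it helps with narrowing this down, below is a rough Python sketch for checking whether the HDR capability (the CTA-861 HDR static metadata block) actually shows up in the EDID that Windows sees through the AVR. It assumes the EDID is cached in the usual registry location (HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\...\Device Parameters\EDID); note that stale entries from previously connected displays can appear there too:

```python
# Hedged sketch: list each EDID Windows has cached in the registry and report
# whether a CTA-861 HDR Static Metadata Data Block (extended tag 0x06) is present.
import winreg

DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def iter_subkeys(key):
    """Yield the names of all subkeys of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

def has_hdr_block(edid: bytes) -> bool:
    """Scan CTA-861 extension blocks for the HDR static metadata data block."""
    for off in range(128, len(edid), 128):      # extension blocks follow the 128-byte base EDID
        block = edid[off:off + 128]
        if len(block) < 5 or block[0] != 0x02:  # 0x02 = CTA-861 extension tag
            continue
        dtd_start = block[2]                    # offset where detailed timings begin
        pos = 4                                 # data block collection starts at byte 4
        while pos < dtd_start:
            tag, length = block[pos] >> 5, block[pos] & 0x1F
            # tag 7 = "use extended tag"; extended tag 6 = HDR static metadata
            if tag == 7 and length >= 1 and block[pos + 1] == 0x06:
                return True
            pos += 1 + length
    return False

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as disp:
    for model in iter_subkeys(disp):
        with winreg.OpenKey(disp, model) as mkey:
            for inst in iter_subkeys(mkey):
                try:
                    with winreg.OpenKey(mkey, inst + r"\Device Parameters") as pkey:
                        edid, _ = winreg.QueryValueEx(pkey, "EDID")
                except OSError:
                    continue                    # no EDID stored for this entry
                print(model, inst, "HDR static metadata block:", has_hdr_block(edid))
```

If the HDR block disappears from the EDID whenever the signal goes through the AVR, that would point at the receiver rather than at Windows or the TV.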
All help appreciated and thank you in advance!