[SOLVED] USE HDR problem with Windows 11 and 3080Ti gaming laptop

Swampthing

Distinguished
May 19, 2008
9
0
18,510
I posted this question in the news commentary thread, but I think it would be better if I repost it here, as I have a new problem with Windows 11. When I plug in my HDMI connection for my LG C1 TV, it sees the TV as a second display. No problem. But I would prefer to play in full 4K and HDR on my TV, since my laptop screen doesn't support 4K or HDR. So I use SECOND SCREEN ONLY and it's fine. But once I go back into the settings and click the USE HDR switch, the TV screen disconnects and all I get is screen one on my laptop monitor. And then I can't turn USE HDR off, because it says the display is not active... I have an XSX and PS5, as well as a 4K Apple TV, and they all run fine in 4K HDR on this TV.

As background, I was using my Asus G733QSA ROG Strix 17 without any of these connection problems at all, with the second screen set to the TV's 3840 resolution and HDR.

But I recently upgraded to the new Asus ROG STRIX SCAR 17 SE with Windows 11, and now I have the problem. The new laptop has a 2560x1440 resolution at 240Hz; the old laptop had a 1080p resolution at 60Hz. My LG C1 runs at 3840 resolution with full 4K and HDR.

There isn't anything wrong with the HDMI port, as everything works fine initially. And I can still use it to duplicate displays. But when I click on USE HDR in Windows 11, bam... it disappears. I just want to be able to pipe my new laptop to my LG C1 and have full 4K and HDR.

Any suggestions? I am at a loss how to fix this problem. Tried uninstalling drivers, reinstalling, unplugging/replugging HDMI, etc. Nothing seems to work.
 
Solution
i had a lot of issues with Windows and HDR when i first started using it with an older MSI Optix 1440p 165Hz HDR monitor a while ago.
and again later using my LG CX 3840p 120Hz HDR tv.

many times Windows couldn't get the colors right, would claim HDR was initiated in settings but wouldn't actually take effect, would often just pop out of HDR mode for no reason, etc.

for some people facing similar issues, the cause is a lower quality cable that can't provide stable bandwidth at those frequencies.
others report that it's simply a common issue with Windows itself, as it had been for me.

after my 3rd or 4th fresh OS install i have finally gotten 3840p 120Hz 10bit HDR to work perfectly with my CX with no hardware changes whatsoever.
whether this was due to updates within Windows or to graphics drivers, i couldn't pin down; nothing of the sort stood out as the reason.
when I click on USE HDR in Windows 11, bam... it disappears
do you suffer the same issue, with the lack of HDR, when using only the C1 as the primary screen and the dedicated laptop screen disabled?

and are you sure you have a high-quality HDMI 2.1 cable that works fine with other HDR devices?
 

Swampthing

i had a lot of issues with Windows and HDR when i first started using it with an older MSI Optix 1440p 165Hz HDR monitor a while ago.
and again later using my LG CX 3840p 120Hz HDR tv.

many times Windows couldn't get the colors right, would claim HDR was initiated in settings but wouldn't actually take effect, would often just pop out of HDR mode for no reason, etc....

...and are you sure you have a high-quality HDMI 2.1 cable that works fine with other HDR devices?

Yes, I use high-quality Zeskit 8K HDMI 2.1 cables for my XSX, PS5, Apple TV, and PC. None of the other devices has any problem with 4K HDR. And don't forget the odd part... everything works perfectly with my G733QSA. I get full 4K HDR using the exact same cable. That's why I figured it has something to do with the combination of Windows 11 HDR, the laptop settings, and the LG C1.

Right now I am using it to broadcast 4K to the tv as a second screen only. But if I were to switch on the HDR in Windows 11, it immediately drops broadcasting and goes back to the laptop screen.
 
if I were to switch on the HDR in Windows 11, it immediately drops broadcasting and goes back to the laptop screen.
you didn't say whether you have tried disabling the onboard laptop screen and using the C1 as the primary display.
if it is even possible with your laptop, many have claimed this was the only way to get it working correctly for them in a similar scenario.
 

Swampthing

Looking at the spec page for that model number, this range has HDMI 2.0b and not 2.1. That would be the problem: https://rog.asus.com/uk/laptops/rog-strix/2021-rog-strix-scar-17-series/spec

Possibly, but you're looking at the wrong model. I have the Strix Scar 17 SE. Here are the specs: ROG STRIX SCAR G733CX-XS97, with the new Intel Core i9-12950HX, an RTX 3080 Ti, and an HDMI 2.1 port.

My older computer, the G733QSA-XS99, works just fine with the same cable and TV, and it has an HDMI 2.0b port. But keep in mind that HDMI specs are backward compatible, just like USB: an HDMI 2.1 cable will work just fine with a 2.0b port. And it does on that laptop.
 

Swampthing

you didn't say whether you have tried disabling the onboard laptop screen and using the C1 as the primary display.
if it is even possible with your laptop, many have claimed this was the only way to get it working correctly for them in a similar scenario.

I haven't tried disabling it through Device Manager, if that's what you mean. Keep in mind it does display full 4K and shuts off the laptop screen, so I get a 3840x2160 picture. It's just the "USE HDR" switch in Display Settings that causes it to quit.

I'm wondering if it might have something to do with the refresh rate. The laptop can use either 240Hz or 60Hz, and the LG C1 can go up to 120Hz. When I am using it without the HDR setting turned on, the laptop is outputting 60Hz.
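For what it's worth, a rough back-of-the-envelope calculation supports the bandwidth angle. This sketch only counts active pixels (real HDMI signaling adds blanking and encoding overhead, so actual requirements are higher), but it shows why 4K60 SDR fits within HDMI 2.0b while 4K120 10-bit HDR needs HDMI 2.1:

```python
# Rough uncompressed video data rates for RGB/4:4:4 output.
# Active pixels only -- real HDMI link rates must also carry
# blanking intervals and encoding overhead, so these are lower bounds.

def raw_rate_gbps(width, height, hz, bits_per_channel):
    """Raw pixel data rate in Gbit/s (3 color channels)."""
    return width * height * hz * bits_per_channel * 3 / 1e9

# 4K 60Hz 8-bit SDR -- roughly what the old G733QSA (HDMI 2.0b) sends
sdr_4k60 = raw_rate_gbps(3840, 2160, 60, 8)     # ~11.9 Gbit/s

# 4K 120Hz 10-bit HDR -- what the LG C1 supports over HDMI 2.1
hdr_4k120 = raw_rate_gbps(3840, 2160, 120, 10)  # ~29.9 Gbit/s

print(f"4K60 8-bit SDR  : {sdr_4k60:.1f} Gbit/s (within HDMI 2.0b's 18 Gbit/s)")
print(f"4K120 10-bit HDR: {hdr_4k120:.1f} Gbit/s (needs HDMI 2.1, up to 48 Gbit/s)")
```

So even if the cable and port are HDMI 2.1, a 60Hz 8-bit handshake fits HDMI 2.0 limits, while turning on HDR at higher refresh rates pushes well past them.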
 
Possibly, but you're looking at the wrong model. I have the Strix Scar 17 SE. Here are the specs: ROG STRIX SCAR G733CX-XS97, with the new Intel Core i9-12950HX, an RTX 3080 Ti, and an HDMI 2.1 port.

My older computer, the G733QSA-XS99, works just fine with the same cable and TV, and it has an HDMI 2.0b port. But keep in mind that HDMI specs are backward compatible, just like USB: an HDMI 2.1 cable will work just fine with a 2.0b port. And it does on that laptop.
Ah fair enough if it’s a different model. That one does say version 2.1.
 
When I am using it without the HDR setting turned on, it is outputting 60 from the laptop.
that's a bit odd.
the system and/or graphics card should read the display properties live as available.

it really seems this system just is not configured/prepared to run the type of setup you have in mind.

maybe just return it and look for something with documentation stating it can handle your proposed environment?
 

Swampthing

that's a bit odd.
the system and/or graphics card should read the display properties live as available.

it really seems this system just is not configured/prepared to run the type of setup you have in mind.

maybe just return it and look for something with documentation stating it can handle your proposed environment?

Not configured/prepared to run the setup I have in mind? :ROFLMAO: The Asus ROG Strix SCAR 17 SE is rated the most powerful gaming laptop in the world right now, with a 12th Gen Intel Core i9-12950HX processor (8 P-cores and 8 E-cores), 30MB cache, and up to 5.0GHz clock speed.

Yet none of that power matters, as using the HDMI port for output to a 4K HDR device is quite possibly the simplest thing you can ask of any computer. It should work out of the box. And it does... to an extent. It outputs 4K, not HDR, but I think it's tied to a Windows 11 settings problem.
 
And it does... to an extent. It outputs 4K, not HDR, but I think it's tied to a Windows 11 settings problem.
obviously it doesn't if it cannot properly read and output the display's native refresh rate or 10bit HDR color.

whether it's an issue with the hardware itself or with the OS you will have to determine yourself.
as i already explained, Windows is known to have odd issues with HDR but unless you're willing to go through troubleshooting it and possibly reinstalling the OS you may as well just return the system.
 

Swampthing

obviously it doesn't if it cannot properly read and output the display's native refresh rate or 10bit HDR color.

whether it's an issue with the hardware itself or with the OS you will have to determine yourself.
as i already explained, Windows is known to have odd issues with HDR but unless you're willing to go through troubleshooting it and possibly reinstalling the OS you may as well just return the system.

I think you'll recall I already did every single one of those things. But I finally found the solution... and now my PC is outputting 4K HDR at 120Hz. It took a combination of steps. Check out this video:
https://youtu.be/bpsUwmSOoMk