[SOLVED] HDR via TV

Just picked up a Vizio M558-G1 yesterday as my 'computer monitor'. It is stupid large... but I will do my best to cope lol. It replaces an older low-end 4K 40" Samsung 40D display.

Anywho, the picture overall looks pretty great. I think the reds need some adjustment, but I will sort that out over time as I get to learn the ins-and-outs of the TV.

My main question is... how on earth do you get a PC to display HDR? The standard picture looks fantastic, but I would love to play some HDR games (I have an old copy of ReCore to test with, which I think was the first HDR-capable game?) and movies down the road if possible.

Setup:
GTX 1080 - latest NVIDIA drivers
Win10 Pro 1903, fully updated
i7-2600 (looking to upgrade soon to a Ryzen 3000 series)
Vizio M558-G1 - Latest firmware update

Connection goes from DVI on the GTX 1080 to HDMI 2 on the TV; HDMI 1 is used for ARC out to a soundbar.
TV is set up in "Computer" mode which allows for full 4:4:4 input (big upgrade in clarity over the old 4:2:2 panel that this replaced!).
I have looked at a few forums about enabling HDR on win10/nVidia when going out to a TV, but the options mentioned do not appear to be available? Maybe a registry hack to bring the options forward?
When I try to enable HDR via the Win10 Settings app for "games and movies", it acts as if it is going to enable, brings up a confirmation box asking if I want to keep the changes, and then flips right back to 'off'.

Any help is appreciated!
CaedenV
 
Solution
Your monitor doesn't support HDMI 2.1+, so you can't use RGB 12-bit.
Use 4:2:2 12-bit in the NVIDIA profile and disable that HDR crap in the Windows display settings.

And that's pretty much it. HDR content will still be available (if the game/app/YouTube has it) while Windows won't be washed out.
I would first ask you to connect to the display using your HDMI port, not the DVI port. Next, if you haven't read this article, see if it helps you out.
I didn't even think that the cable could be the issue. Now that I think of it, I have had this DVI-to-HDMI cable for 10+ years... probably not rated for 'high bandwidth' applications.
I'll pick up a newer HDMI-to-HDMI cable and follow up... but now that I think about it, I believe my GPU only has mini HDMI out... this could get annoying quickly lol
 
Update:
Not perfect yet, but much improved! DVI appears to be limited to HDMI 1.4 features.
HDMI out on the GPU unlocked a bunch of features: it can now do RGB 4:4:4, 8-bit, and 'full' dynamic range, which seems to be the key to making things work.
Then, in Windows settings, when I flip the HDR switch it sticks and behaves the way I expected it to.
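
For the curious, here is some napkin math on why the old DVI link topped out and why 8-bit ends up being the ceiling even on the new cable. A quick Python sketch; it assumes the standard CTA-861 4K60 timing (~594 MHz pixel clock with blanking) and nominal TMDS rates, so treat the numbers as approximations:

Code:
# Rough HDMI link-budget math for 4K60. The CTA-861 4K60 timing is
# 4400x2250 total pixels (blanking included) at 60 Hz, ~594 MHz pixel
# clock; TMDS 8b/10b encoding leaves 80% of the raw rate for video data.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60

LINKS_BPS = {
    "HDMI 1.4 / DVI-class": 10.2e9 * 0.8,  # ~8.16 Gbps usable
    "HDMI 2.0/2.0b": 18.0e9 * 0.8,         # ~14.4 Gbps usable
}

# Bits per pixel on the wire. HDMI packs YCbCr 4:2:2 into a fixed 24-bit
# container (up to 12 bits per component), so 12-bit 4:2:2 costs no more
# bandwidth than 8-bit RGB.
FORMATS_BPP = {
    "RGB 4:4:4, 8-bit": 24,
    "RGB 4:4:4, 10-bit": 30,
    "RGB 4:4:4, 12-bit": 36,
    "YCbCr 4:2:2, 12-bit": 24,
}

for fmt, bpp in FORMATS_BPP.items():
    need_bps = PIXEL_CLOCK_HZ * bpp
    verdicts = "  ".join(
        f"{link}: {'fits' if need_bps <= cap else 'too big'}"
        for link, cap in LINKS_BPS.items()
    )
    print(f"{fmt}: {need_bps / 1e9:.2f} Gbps -> {verdicts}")

4K60 RGB 8-bit needs about 14.26 Gbps, which just squeaks under HDMI 2.0's ~14.4 Gbps but blows way past the ~8.16 Gbps that an HDMI 1.4-class DVI link can carry.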

A few things of note...
The Good:
OMG everything HDR is pretty!
-Games, even SDR games, pop the way I would expect. Fired up ReCore and the amount of detail is great! Need to find a few more HDR-capable titles to play with.
-SDR movies look about how they always have (though the better quality panel is a nice addition!), and HDR movies look like you are watching through a window!

The Bad:
-So much "HDR" content out there is not HDR; it is SDR encoded at an HDR brightness level... which is downright painful to watch on a 600-nit screen. So viewer beware!
-SDR desktop applications, and the desktop itself, look absolutely terrible: washed out, as if there is some sort of scaling or a grey mesh over everything. Solid colors literally look like they have a texture or pattern over them. The mouse, when over SDR apps and content, looks like it has a rainbow effect around it instead of a black outline. Something definitely looks off or wrong about it.
-Going beyond 8-bit color is not available over HDMI 2.0/2.0b (at 4K/60 4:4:4); the napkin math above shows why. To unlock this it looks like I need to go DisplayPort out on the GPU to HDMI 2.1 on the display, which I think will give me 10-bit, but not the full 12-bit that the panel supports. This higher bit depth may help with the weird desktop artifacting/grey masking that I am seeing though. Will follow up if it does.
-After switching to HDR output on the PC, the display appears to be having some weird issues. It loses picture (and audio, as I am going through ARC). I thought this was a signaling issue at first, but noticed that my controller would still vibrate in game, and everything appeared to still be running normally. I also started to notice that when it is like this, some features of the on-display menu stop working. When trying to use the phone remote app, it would say that the display itself is not available. This happens on occasion when watching SDR or HDR content, and happens a lot more when playing games and things are moving around a lot. Kinda wondering if perhaps I am crashing the in-display computer? Or maybe it simply is a cable/sync issue and it is devoting all resources to restoring the connection? It is weird.
-HDR on YouTube wants to render on my CPU. Being an older i7-2600, this completely pegs the CPU at 100% and drops more than half of the frames. It is pretty! But it is a slide-show. Wish YouTube had a 4K/30/HDR standard, as I imagine that would work better. 1440p and below plays back smooth though and looks pretty fantastic. Need a newer CPU (or a future GPU?) to help off-load that render. (See the format-listing sketch after this list.)
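
On the YouTube point: you can see which streams are actually HDR by listing the available formats. Here is a rough sketch using the yt-dlp Python package (pip install yt-dlp; youtube-dl has the same API, and the URL below is just a placeholder). YouTube serves HDR as VP9 Profile 2, a 10-bit codec profile, so if the browser can't hand that to the GPU it falls back to software decode on the CPU:

Code:
# List the YouTube formats that are actually HDR. YouTube serves HDR as
# VP9 Profile 2 (vcodec strings starting with "vp09.02"), its 10-bit
# profile. The URL is a placeholder; point it at any HDR video.
from yt_dlp import YoutubeDL

URL = "https://www.youtube.com/watch?v=XXXXXXXXXXX"  # placeholder

with YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(URL, download=False)

for f in info["formats"]:
    vcodec = f.get("vcodec") or ""
    if vcodec.startswith("vp09.02"):  # VP9 Profile 2 = 10-bit / HDR
        print(f.get("format_id"), f.get("height"), f.get("fps"), vcodec)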

Anywho, next step is to go find me a DisplayPort-to-HDMI cable and see if that helps things further. Will keep you posted.
 
Your monitor doesn't support HDMI 2.1+, so you can't use RGB 12-bit.
Use 4:2:2 12-bit in the NVIDIA profile and disable that HDR crap in the Windows display settings.

And that's pretty much it. HDR content will still be available (if the game/app/YouTube has it) while Windows won't be washed out.
 
Ok, will give that a try.
 
So, I thought that I was on 1903, but I was apparently on 1809; after the update this week I gave the settings another go, and things are substantially better in RGB mode. Much happier now.
Still really hard to play HDR video on YouTube, as it will only render on CPU, but after downloading a few demo HDR rips that play back on GPU, I am extremely happy with the results!

... now to downgrade the firmware on my Blu-ray drives and start getting my own HDR content to work.
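
If it helps anyone doing the same: here is a quick sanity check (a sketch; it needs FFmpeg's ffprobe on the PATH, and the file path is a placeholder) to tell real HDR10 files from the "SDR mastered too bright" stuff I complained about earlier. Genuine HDR10 video is tagged with the smpte2084 (PQ) transfer function and bt2020 primaries, while plain SDR reports bt709 or nothing:

Code:
# Check whether a video file actually carries HDR10 metadata, or is just
# SDR that was mastered too bright. Requires FFmpeg's ffprobe on the PATH.
import json
import subprocess

PATH = "movie.mkv"  # placeholder; point at a real rip

out = subprocess.run(
    ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
     "-show_entries", "stream=pix_fmt,color_transfer,color_primaries",
     "-of", "json", PATH],
    capture_output=True, text=True, check=True,
).stdout
stream = json.loads(out)["streams"][0]

# HDR10 = PQ transfer (smpte2084) + BT.2020 primaries; SDR shows bt709.
is_hdr10 = (stream.get("color_transfer") == "smpte2084"
            and stream.get("color_primaries") == "bt2020")
print(stream, "-> looks like HDR10" if is_hdr10 else "-> SDR (or untagged)")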