Question Max Res for HDR + VRR on RX 5700 XT via HDMI?

jamesm113

Distinguished
Dec 6, 2012
55
1
18,535
I have a RX 5700 XT hooked up to my TV. TV only has HDMI input, no DisplayPort.

Since most games I play are 5+ years old, I'd like to run 1440p with HDR and a refresh rate faster than 60 Hz.

According to this chart: https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_for_HDR10_video, I should be able to get 100 Hz (or maybe 120 Hz in 4:2:0 mode... not sure what that is).

However, if I set my refresh rate to 100 Hz or higher, I cannot enable HDR in Windows. Sometimes, I notice that Windows reports the "Active signal resolution" as 3840 x 2160. My TV also reports 3840 x 2160. Is that why I can't get HDR working, because it's actually outputting 4K?

Has anyone with an RX 5700 XT gotten 100 Hz or 120 Hz working with HDR and VRR at 1440p via HDMI?
 
The issue is that your GPU only supports HDMI 2.0, which doesn't have the bandwidth for 4K HDR @ 120 Hz. If you want 4K HDR you need to drop to 60 Hz and 4:2:0 chroma (which I'll note makes text display *bad*, but is less visible in gaming). You can also drop to 1440p/120Hz, but you still need 4:2:0 chroma. The best option if you want to maintain 4:4:4 chroma would be 1440p/100Hz, which the TV should support.
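To put rough numbers on that bandwidth limit, here's a back-of-the-envelope sketch in Python. The ~20% blanking overhead and the averaged 4:2:0 data-rate model are simplifying assumptions (real timings come from the display's EDID), so treat the results as ballpark figures, not exact link rates:

```python
# Rough TMDS bandwidth estimate vs. the HDMI 2.0 link limit of 18 Gbit/s.
def tmds_gbps(h, v, hz, bpc, chroma="4:4:4"):
    pixel_clock = h * v * hz * 1.2  # assume ~20% blanking overhead
    # 4:2:0 carries half the average data per pixel of 4:4:4
    chroma_factor = {"4:4:4": 1.0, "4:2:2": 2 / 3, "4:2:0": 0.5}[chroma]
    # TMDS sends 10 link bits per 8 data bits on each of 3 channels
    bits_per_pixel = 3 * bpc * chroma_factor * 10 / 8
    return pixel_clock * bits_per_pixel / 1e9

for mode in [(3840, 2160, 60, 10, "4:4:4"),   # 4K60 HDR, full chroma
             (3840, 2160, 60, 10, "4:2:0"),   # 4K60 HDR, subsampled
             (2560, 1440, 120, 10, "4:4:4"),  # 1440p120 HDR
             (2560, 1440, 120, 8, "4:4:4"),   # 1440p120, 8-bit
             (2560, 1440, 100, 10, "4:4:4")]: # 1440p100 HDR
    gbps = tmds_gbps(*mode)
    verdict = "OK" if gbps <= 18 else "exceeds HDMI 2.0"
    print(mode, f"{gbps:.1f} Gbit/s", verdict)
```

With these assumptions, 4K60 10-bit 4:4:4 and 1440p120 10-bit both come out over 18 Gbit/s, while 1440p100 10-bit and 1440p120 8-bit fit, which lines up with the modes discussed above.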

Failing that, you could upgrade to a GPU that can output HDMI 2.1. Alternatively, you could use a DisplayPort to HDMI 2.1 converter, since DisplayPort doesn't have the bandwidth limitations of HDMI 2.0.
 

jamesm113

The best option if you want to maintain 4:4:4 chroma would be 1440p/100Hz, which the TV should support.
Thanks for confirming this, I thought so too. However, I can't seem to get Windows to actually output 1440p. It seems to output 4K to the TV. Windows reports an "Active signal resolution" of 3840 x 2160.

Are there any downsides of dropping to 4:2:0 chroma?
 
In game? Not really. But it makes text rendering a *lot* worse, and text is already not great on the panel due to the pixel arrangement (I own a C2).
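For anyone wondering what 4:2:0 actually throws away, here's a small illustrative sketch (the function name is just for illustration): brightness stays at full resolution, but the two color planes are stored at half width and half height, which is why sharp colored text edges suffer most.

```python
# 4:2:0 keeps full-resolution brightness (luma) but stores the two color
# (chroma) planes at half width and half height, i.e. 1/4 the samples.
# Games rarely show it; small colored text makes it obvious.
def sample_counts(w, h, chroma):
    luma = w * h
    if chroma == "4:4:4":
        cb = cr = w * h                 # full color resolution
    elif chroma == "4:2:0":
        cb = cr = (w // 2) * (h // 2)   # quarter color resolution
    return luma, cb, cr

print(sample_counts(3840, 2160, "4:4:4"))  # (8294400, 8294400, 8294400)
print(sample_counts(3840, 2160, "4:2:0"))  # (8294400, 2073600, 2073600)
```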

As for the resolution, I don't know why it won't go to 1440p; are you switching through AMD's driver, or trying to have Windows do it?
 

jamesm113

Trying to have Windows do it. I couldn't figure out how to use the AMD driver to switch the resolution, though I could sometimes get it to switch the color profile.

For whatever reason, 1440p @ 120 Hz seems to work with VRR and HDR in 8-bit RGB mode. None of the other 1440p refresh rates work with HDR, because they all seem to actually output 4K.

Oddly enough, GPU scaling is disabled in the AMD tool, but if I change the scaling mode to center, only a small picture displays on the TV.

So I suspect somehow the AMD driver has GPU scaling enabled even though it's configured to be off.

I might have some old DP to HDMI adapters lying around. I don't know if they're HDMI 2.1, but I might give them a try to see if they make a difference.
 
There's a whole side discussion about 8-bit HDR, given there's a lot of faking going on. The general consensus is that it isn't true HDR, but Windows allows it, if nothing else. [I ran into many of the same issues while waiting for the 3000 series to hit :p]

DP-to-HDMI 2.1 adapters exist; just note that most don't handle VRR very well.

I recommend always making resolution/chroma changes through the driver if at all possible. I haven't owned an AMD card in ages (the only one I ever had died in three months, and their driver utility was horrid), but someone else should be able to assist with that.
 