Question: RTX 4090 and HDMI 2.1

Aug 24, 2022
I have a 4K 144Hz monitor.

But I heard the 4090 only supports DP 1.4 (4K 120Hz). However, the 4090 also has an HDMI 2.1 port, which supports 4K 144Hz.

Has anyone tried the 4090 with HDMI 2.1 yet? Then we could set it up at 4K 144 instead of 4K 120.
 

Eximo

Does your monitor support HDMI 2.1? They are pretty rare so far, and you have to be careful with how they label things.

DisplayPort 1.4a tops out around 4K 120Hz at 8-bit color (without DSC).

HDMI 2.1 should be able to do 10-bit 4K 144Hz; I don't see why it wouldn't work with a capable monitor.
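For a rough sense of how those limits fall out, here is a back-of-the-envelope check. This is only a sketch: the ~7% blanking overhead is an assumed round figure (real CVT-RB timings differ slightly), and the payload numbers just subtract the line-coding overhead from the raw link rates.

```python
# Approximate video bandwidth vs. link payload.
# DP 1.4 HBR3: 32.4 Gbps raw, 8b/10b coding  -> ~25.92 Gbps payload
# HDMI 2.1 FRL: 48 Gbps raw, 16b/18b coding  -> ~42.67 Gbps payload
def video_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.07):
    # 3 components per pixel (RGB or YCbCr 4:4:4), assumed ~7% blanking overhead
    return width * height * blanking * refresh_hz * 3 * bits_per_channel / 1e9

DP14_PAYLOAD_GBPS = 25.92
HDMI21_PAYLOAD_GBPS = 42.67

for hz, bpc in [(120, 8), (120, 10), (144, 8), (144, 10)]:
    need = video_gbps(3840, 2160, hz, bpc)
    dp = "fits" if need <= DP14_PAYLOAD_GBPS else "needs DSC"
    hdmi = "fits" if need <= HDMI21_PAYLOAD_GBPS else "needs DSC"
    print(f"4K {hz}Hz {bpc}-bit 4:4:4: ~{need:.1f} Gbps  (DP 1.4: {dp}, HDMI 2.1: {hdmi})")
```

With these assumptions, 4K 120Hz 8-bit just squeezes into DP 1.4, while anything higher needs DSC on DP; uncompressed 4K 144Hz 10-bit still fits within a full-rate HDMI 2.1 link.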
 
Aug 24, 2022

Yes, my monitor supports HDMI 2.1.
It says this:
HDMI 2.1 (support 4K 144Hz 4:4:4 @ DSC enabled or 4K 144Hz 4:2:0) x2
DisplayPort 1.4 (DSC) x1

I leave its full specs here:
https://www.gigabyte.com/Monitor/M32U/sp#sp
 
Aug 24, 2022
For the highest refresh possible, yes.
As long as the GPU, cable, and monitor support HDMI 2.1, you're good.

Thanks. I have been using HDMI 2.1 with my 4090 and it is working great so far. However, I do sometimes find that the monitor doesn't detect the cable, so I have to disconnect and reconnect it.
 
That's odd.

Try this to test: right-click the desktop -> Display settings -> Advanced display settings. Do you have any odd-numbered options (like 119.880Hz)?
If yes, try going down to that and testing.
 
Aug 24, 2022

Wow, now that you mention it, I hadn't noticed this before. My monitor is the Gigabyte M32U 4K 144Hz. However, in this setting it only shows 143.998Hz. I didn't know it meant anything.

What happens if I do have odd numbers?
 
The decimal number is a multiple of the leftover legacy analog NTSC 59.94Hz standard. Digital panels nowadays are able to do 119.880Hz AND 120Hz without any issue. Just make sure you are using the same settings throughout and it should be good.

I just got the same monitor. Spent the whole afternoon tweaking it. ;)
Don't go down to just 143.998Hz; go down a substantial amount, like to 119.880Hz. Also, are you using RGB 4:4:4 or YCbCr 4:4:4? If using RGB, try YCbCr.
What we need to do is lower the peak bandwidth to see if your issue goes away.
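For reference, that 1000/1001 factor is where the familiar decimal rates come from. Quick arithmetic to illustrate (nothing here is specific to this monitor):

```python
# Legacy NTSC timing divides the nominal TV rates by 1.001 (i.e. multiplies by 1000/1001).
for nominal in (24, 30, 60, 120):
    print(f"{nominal} Hz -> {nominal * 1000 / 1001:.3f} Hz")
# prints 23.976, 29.970, 59.940, 119.880
```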
 
No, smaller deviations like this are likely due to pixel clock rounding as specified by the CVT-RB standard, not from the NTSC 1000/1001 thing.
 
Incorrect.

https://www.youtube.com/watch?v=mjYjFEp9Yx0&t=379s

There are other sites explaining it as well, but 5 minutes of this gets everyone up to speed.
If the timestamp doesn't work, start at 6:20 (or watch the whole thing; it's very informative).
I'm aware of the NTSC standard. It is one (but not the only) possible reason for odd refresh rates. It would decrease the frame rate by a fixed factor of 1000/1001 and generally is only applied to specific TV-related frame rates as defined in CTA-861, namely 24, 30, 60, 120, and 240 FPS. Even if it were applied to 144 Hz for some reason, we would expect the result to be 143.856 Hz. But that's not what we find.

143.998 Hz is more likely due to rounding the pixel clock down to the next-lowest multiple of 0.25 Mpx/s, as specified by the CVT-RB standard.
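To put numbers on both explanations, here is a small sketch. The 4000 x 2222 total (active plus blanking) is an assumed, CVT-RB-style figure used only for illustration; it is not read from the M32U's actual EDID.

```python
h_total, v_total = 4000, 2222   # assumed active + blanking totals, illustration only
target_hz = 144

# Explanation 1: NTSC 1000/1001 factor applied to 144 Hz
print(f"144 * 1000/1001 = {target_hz * 1000 / 1001:.3f} Hz")                      # 143.856 Hz

# Explanation 2: pixel clock rounded down to a 0.25 MHz step (CVT-RB rule)
ideal_mhz = h_total * v_total * target_hz / 1e6                                    # ~1279.87 MHz
rounded_mhz = (ideal_mhz // 0.25) * 0.25                                           # 1279.75 MHz
print(f"rounded clock gives {rounded_mhz * 1e6 / (h_total * v_total):.3f} Hz")     # ~143.986 Hz
```

With these assumed totals the rounding lands just under 144 Hz, which is the shape of number Windows reports; the exact decimals depend on the real blanking values in the monitor's EDID.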
 
Yeah, I disagree. The odd decimal numbers that we see in the OS come from legacy NTSC.
 
Aug 24, 2022
Did you actually watch the video?
Also, if you knew NTSC, you would know that 59.94Hz is itself an approximation, so you can't just double it and say, "Look. The numbers don't match."

Edit - This is getting OT. If the OP needs anything further, I'm gonna let them reply back.

Thanks for the information you have provided. I really, really appreciate it. It is also amazing that we own the same monitor. While most of the technical language here is beyond my ability to understand, it is very informative. I will dig deeper into it when I have more time.

"Also, are you using RGB 4:4:4 or YCbCr 4:4:4?"

What are the differences between the two? The benefits?

BTW, I have no idea how to switch between the two. Please kindly provide guidance.
 
I switch between them in the Radeon software. You can do the same in NVIDIA settings.

On paper RGB is better, but when using 4:4:4 YCbCr they're close to visually identical. RGB supposedly gives less eye strain, darker darks, and lighter lights though. YCbCr uses less bandwidth, which is what we want to test to see if it fixes your issue.
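On the "visually identical" point: at 4:4:4, YCbCr is essentially a reversible re-encoding of RGB. A minimal sketch using the BT.709 coefficients (full range, ignoring quantization, which is where small real-world differences can creep in):

```python
# Full-range BT.709 RGB <-> YCbCr: a reversible change of basis at 4:4:4.
def rgb_to_ycbcr(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return r, g, b

print(ycbcr_to_rgb(*rgb_to_ycbcr(0.25, 0.50, 0.75)))  # ~(0.25, 0.50, 0.75) back again
```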
 
Can you go to Windows Display settings, then Advanced display settings near the bottom, and give us a screenshot of the info there?

You can adjust the color format in the NVIDIA Control Panel under Display -> Change resolution, at the bottom.

For black screen issues in this situation, my first suspicion would be DSC, which is a compression method used to transmit 4K 144 Hz since the M32U doesn't support the full 48G bandwidth of HDMI 2.1. Other monitors that implement DSC also have reports of random black screens and similar issues. Some people blame it on NVIDIA drivers, so it may improve with future updates. Hard to say, though; it's mostly speculation at this point. Lowering the bandwidth as Alceryes suggested, by enabling chroma subsampling or lowering the refresh rate, might help alleviate the issue if that's what it is. But of course you aren't getting the full benefit of the monitor in that case.
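As a rough idea of how hard DSC has to work here: the 24 Gbps link rate below is an assumption about this model (the thread only establishes that the port is less than the full 48G), and the blanking overhead is the same approximation as before.

```python
# Assumed reduced-rate HDMI 2.1 link on the monitor (NOT confirmed in this thread).
link_raw_gbps = 24
link_payload_gbps = link_raw_gbps * 16 / 18                 # FRL 16b/18b coding -> ~21.3 Gbps

uncompressed_gbps = 3840 * 2160 * 1.07 * 144 * 30 / 1e9     # 10-bit 4:4:4, ~7% blanking
ratio = uncompressed_gbps / link_payload_gbps
print(f"~{uncompressed_gbps:.1f} Gbps uncompressed -> roughly {ratio:.1f}:1 compression needed")
# DSC is commonly described as visually lossless up to about 3:1, so ~1.8:1 is well within range.
```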

On paper RGB is better, but when using 4:4:4 YCbCr they're close to visually identical. RGB supposedly gives less eye strain, darker darks, and lighter lights though. YCbCr uses less bandwidth, which is what we want to test to see if it fixes your issue.
Just a note, YCbCr 4:4:4 uses the same bandwidth as RGB; it only uses less bandwidth when enabling chroma subsampling (4:2:2 or 4:2:0).
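Per-pixel numbers behind that note, assuming 10 bits per component (just the arithmetic):

```python
bpc = 10  # bits per component
formats = {
    "RGB / YCbCr 4:4:4": 3.0 * bpc,   # full chroma: 3 samples per pixel
    "YCbCr 4:2:2":       2.0 * bpc,   # chroma halved horizontally
    "YCbCr 4:2:0":       1.5 * bpc,   # chroma halved in both directions
}
for name, bits in formats.items():
    print(f"{name}: {bits:.0f} bits/pixel ({bits / (3 * bpc):.0%} of 4:4:4)")
```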
 
Interesting. I thought that even at 4:4:4, YCbCr couldn't get as dark or as light as RGB across the color spectrum.
Now I gotta go open some books. ;)

Edit - For the OP, if you don't mind lowering your refresh to 120/119.880Hz, then you should be able to go full RGB 4:4:4 with 10 bits per channel color (depending on your cable and GPU). That's what I have mine set to and am VERY happy.

Also, a warning about the sRGB mode on this M32U monitor: it locks out many other settings, including overdrive. It's great for desktop use but is unusable in gaming due to horrible pixel overshoot. You can see it as a hardening/outlining of objects in games when moving; as soon as you stop moving, the outlines become soft again.

I ended up having to create a custom mode that almost mirrors the sRGB settings (as best I can tell) but uses the 'Picture Quality' overdrive setting. I've found this to be the best for both desktop and gaming.
 
Aug 24, 2022

I have no idea how to switch to 4:4:4 YCbCr, even after Googling. But the current setting is good so far anyway. However, I do get weird screen tearing while playing Cyberpunk 2077. My monitor is 144Hz, but I still get screen tearing when the game runs at 80-90 FPS.
 
Unless you are running some sort of anti-tearing technology (G-Sync, FreeSync, or good old V-Sync), you will see tearing whenever the frame rate doesn't line up with the refresh rate, especially during FPS dips. This is normal.
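The mismatch is easy to see with the numbers from the post above (illustrative only):

```python
refresh_hz, fps = 144, 90
scanout_ms = 1000 / refresh_hz   # ~6.94 ms between display refreshes
frame_ms = 1000 / fps            # ~11.1 ms between rendered frames
print(f"display refreshes every {scanout_ms:.2f} ms, game delivers a frame every {frame_ms:.2f} ms")
# Since 11.1 ms isn't a multiple of 6.94 ms, buffer swaps keep landing mid-scanout,
# which shows up as a tear line. G-Sync/FreeSync instead makes the display wait for each frame.
```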