Question: 10-bit color support on the RTX 3000 cards.

Jan 4, 2025
Hello,
I have a laptop with a Quadro P4200 (Pascal), and I connected an external RTX 3090 via Thunderbolt.
I'm using the GeForce driver for both cards.
I also have a 10-bit SDR display. It is detected fine on both cards, and 10-bit color mode is available.
When I tested the 10-bit output with the VESA DisplayHDR CTS 1.2 tool (https://displayhdr.org/downloads/), I found that the 10-bit output on the RTX 3090 is fake.

I'm planning to use the 3090 as my default GPU. How can I fix this?

Thanks!
 
I have an RTX 4080 Super and a Dell 2721D monitor. With the refresh rate set to 144 Hz, I get 12-bit according to the VESA test; if I increase the refresh rate, it drops to 8-bit. The monitor manufacturer doesn't mention this in the user manual... I have 144 Hz set in Windows in the advanced display settings section, and there it also shows 12-bit. Try lowering the refresh rate and see if that helps.
 
What do you mean, fake 10-bit? Here's my output from the tool with a 3080 Ti. Can you post your output so we can see what's going on?

f3544d607a504863b76060ab77d89f3d.png
 
There you go; seems fine to me. There's an obvious difference between 8-bit and 10-bit here.

83f4f85e757bc8d9e7cf64f1e2c7de16.png


Now the question for you is what resolution and refresh rate you're driving. Thunderbolt tops out at 20/40 Gbit/s, which may simply not be enough to output 10-bit at your desired resolution and refresh rate.

All NVIDIA GPUs of the last four or so generations support true 10-bit output, including the 3090. The only limitations are connection bandwidth and display capabilities. Since the display is capable in your case, the probable issue is the Thunderbolt connection not having enough bandwidth to drive that color depth at your refresh rate and resolution.
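As a rough illustration of the bandwidth argument, here is a back-of-the-envelope sketch (the ~20% overhead factor is my own allowance for blanking and link encoding, not a vendor figure): multiply resolution, refresh rate, and bits per pixel, then compare against the link.

```python
# Back-of-the-envelope check: does an uncompressed video mode fit a link?
# The overhead factor is a rough allowance for blanking and link encoding;
# real timings vary by standard.

def required_gbps(width, height, refresh_hz, bits_per_channel, overhead=1.20):
    bits_per_pixel = 3 * bits_per_channel      # R, G, B components
    raw = width * height * refresh_hz * bits_per_pixel
    return raw * overhead / 1e9

# Example: 4K at 144 Hz with 10 bpc against Thunderbolt 3's 40 Gbit/s total
# (of which only part is available for DisplayPort tunneling):
print(f"{required_gbps(3840, 2160, 144, 10):.1f} Gbit/s")  # 43.0 Gbit/s
```

Even before tunneling overhead, a mode like that is already over the link's total budget, which is why dropping refresh rate or bit depth "fixes" it on some setups.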
 
Can you take a photo with your phone? Photography solves the problem of what displays on the viewer's side, because it uses the camera's entire dynamic range.

No, I'm using a direct HDMI connection; my Thunderbolt is in PCIe x4 Gen 3 mode and can't be the bottleneck.
 
I assure you, I see exactly the same thing with the camera.

Again, you say it "can't" be the bottleneck, but I'm telling you it doesn't matter what mode your Thunderbolt is in; it has plenty of bandwidth limitations that can cause exactly this, right down to the cable you use. You need to verify which version of Thunderbolt you actually have and what the cable you use is rated for.

Here's a photo I took with my S24 Ultra, whose camera is about as good as it gets. This is the 3080 Ti, which is practically the same as a 3090, just slightly cut down.

b8ca013becc45c0953303e9f78958963.jpg
 
The PCIe link affects things a little differently. For example, on old video cards (Radeon R9 290), the link was cut to x1 Gen 1 at the desktop to save power; on newer cards the full width remains, but the per-lane speed is reduced to Gen 1. The same thing happens for me through Thunderbolt: the link speed at the desktop is x4 Gen 1 (per GPU-Z).
My monitor is 1080p 60 Hz, 10-bit, SDR, so the peak HDMI bandwidth is about 4.5 Gbit/s.
PCIe x4 Gen 1 ≈ 8 Gbit/s effective (10 GT/s raw)
PCIe x4 Gen 3 ≈ 32 Gbit/s effective
However, I agree that this is a debatable point.
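The ~4.5 Gbit/s figure checks out if you use the standard 1080p60 timing with blanking included (2200 x 1125 total pixels per frame, per the usual CTA-861 timing). A quick sketch:

```python
# HDMI transmits the full timing (active + blanking) with 3 channels per
# pixel, so the standard 1080p60 total timing of 2200 x 1125 at 10 bpc gives:

def hdmi_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    return h_total * v_total * refresh_hz * 3 * bits_per_channel / 1e9

print(hdmi_gbps(2200, 1125, 60, 10))  # 4.455, i.e. ~4.5 Gbit/s
```

So a 1080p60 10-bit signal is nowhere near the limit of any modern HDMI link; whatever is going wrong here, it isn't raw display bandwidth.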

I believe the main difference between our setups is that you have an HDR monitor, while mine is SDR.
An HDR implementation is impossible without 10+ bits per pixel; the remaining question is how the RTX driver handles SDR displays.

For example, in your photo I can clearly see that you have a 10+ bit display, but the screenshot above is displayed as 8-bit in my browser, even with the Quadro. I use the Google Chrome browser.

https://drive.google.com/file/d/1yhCY_1Nh3Vya049VUoHMzbouS9dsOF9T/view?usp=sharing

Thank you for taking the time to take a photo. Is there a way to disable HDR but keep 10-bit, and to make sure the VESA utility correctly displays the 10-bit gradient?
 
I have switched to SDR mode; here are the results.

3bf6c3b0b7994df53162fe79da5c744c.png


5c6b794e4daefd8689a95230fad5cfaa.jpg

I accidentally cut off the tooltips when I took the photo, but you can infer which gradient is which.

Anyway, even in SDR mode it appears to be correct.

So there is something on your end. Given your resolution and refresh rate, you should have no bandwidth issue whatsoever, which probably means some setting somewhere is messing it up.

Do you have Advanced Color turned off in Windows, by chance? Try turning it on and see; if the issue persists, nuke the drivers with DDU and reinstall them. I'm kind of shooting in the dark here, but it is what it is: the card should be capable of doing what you need, unless there's some weird compatibility issue with the way it's connected.
 
Thank you!

I'll try reinstalling the driver, and maybe something else...
Can I ask: does your screenshot display correctly in your browser? I mean, does it match the photo, or does it still look 8-bit?
 
Not 100% sure what you meant, but yes: the screenshot I see in the browser matches what I've seen directly in the app in all cases. There is always a clear difference between the 8-bit and 10-bit parts for me, as you see in the images above, whether directly in the app or looking at the screenshot in the browser.
 
To me the screenshot looks 8-bit, which means the Display Native gradient matches 8-bit quantization.
Are you using Google Chrome?
 
Here's a better, zoomed-in example of what I see in the screenshot:

f91a98f65413fa14b4b9af6a98301ea1.png


In the 8-bit part you have minor steps between the greys of the gradient, and then a major step; I marked the difference with colored lines. Within, say, the blue line's area, the 8-bit gradient has seven minor steps, and the whole blue area appears as one major shift from the gradient that the red area represents.

The Native and 10-bit parts don't have these obvious big steps. If you try hard, you can sort of "feel" they are there, but in 8-bit they are absolutely obvious: you don't need to squint or zoom in to see them, while the 10-bit major steps are barely visible, to the point that I'm not sure whether they're there or I'm imagining things.

The smaller steps are present in all three, but in the 10-bit and Native parts the steps are shorter in length, and the large steps are nearly indistinguishable, at least in the darker part of the image.

That said, I don't think this screenshot is actually true 10-bit. It shows the difference between the 8-bit and 10-bit lines, but I don't think it's as good as what you see with your eyes or a camera. So the camera method is still better.
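The step counting above can be reproduced numerically. A small sketch (my own illustration, not the VESA tool's code): quantize the same dark grey ramp at both bit depths and count distinct levels.

```python
import numpy as np

# Quantize the same smooth ramp (darkest quarter of the range, where the
# thread says banding is most visible) to 8 and 10 bits and count distinct
# levels: the 8-bit part has roughly a quarter as many, so each band is
# about four times wider and far easier to see by eye.

ramp = np.linspace(0.0, 0.25, 4096)

levels8  = np.unique(np.round(ramp * 255)).size
levels10 = np.unique(np.round(ramp * 1023)).size
print(levels8, levels10)  # 65 257
```

This is also why an 8-bit screenshot of a 10-bit gradient collapses the "Native" strip back to the 8-bit step pattern, matching what the Quadro user reports seeing in the browser.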
 
No access to the link. A regular photo would be enough; photos clearly reveal the gradient steps, thanks to the camera's wide dynamic range.

Yes, it would be great if you could also take a photo of your post's screenshot in SDR mode.
 
You should be able to access the link now.

This is the compressed screenshot version:

7aecdf5d2a15aed0d73ab5d77d0467f8.png