nVidia GeForce LUT

Gregow

I've been trying to find an answer to this, but there seems to be very little information out there, and different theories...

The question itself is rather simple:

If I set my GTX 970 to 10 bits per channel on DisplayPort, or 12 bits per channel on HDMI, do I get an 8 bit LUT with 10 or 12 bit precision? And is there any dithering involved?

Let's assume we have an 8 bit display (with a 10 bit LUT, if that matters). So, no 10 bit output from the display (that's a whole other ball game, and I believe nVidia has enabled 10-bit output for some applications on the GeForce cards).

I've seen claims that nVidia is lagging well behind AMD here by running 8 bits without dithering. With current drivers, however, it seems GeForce cards can get 10 or 12 bit precision through DisplayPort or HDMI, though still without dithering. I don't know how reliable that information is.
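
To be clear about what I mean by dithering, here's a rough sketch (just NumPy with made-up numbers, nothing to do with nVidia's actual driver): when the corrected value falls between two 8-bit codes, an undithered pipeline sends the same nearest code for every pixel, while a dithered one mixes the two neighbouring codes so the spatial average lands on the intended value.

```python
import numpy as np

# Illustrative value only: the LUT computed an ideal output of 100.25
# (in 8-bit code units) at higher-than-8-bit precision. The 8-bit link
# can only carry 100 or 101.
ideal = 100.25

# Without dithering, every pixel gets the same nearest code -> a visible
# step (banding) wherever the ideal value crosses over to the next code.
undithered = np.full((4, 4), round(ideal))            # all 100

# With a simple 2x2 ordered (Bayer) dither, neighbouring pixels alternate
# between the two nearest codes so the spatial average equals the ideal value.
bayer = (np.array([[0, 2],
                   [3, 1]]) + 0.5) / 4                # thresholds in (0, 1)
thresholds = np.tile(bayer, (2, 2))                   # tile over a 4x4 patch
dithered = np.floor(ideal + thresholds).astype(int)   # mostly 100s, one 101 per tile

print(undithered.mean(), dithered.mean())             # 100.0 vs. 100.25
```

Over a smooth gradient, that mixing is what hides the coarse 8-bit steps; without it you get a hard band wherever the value rolls over to the next code.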

Maybe someone here knows how it works?
 

Gregow

That's a whole different thing. I'm not talking about 10 bit output from the display, but about the precision of the LUT.
The LUT can be 8 bit but calculated with 10 bit precision. Meaning, it calculates 1024 levels that get mapped to the 256-level output of the graphics card. Similarly, many displays have a 10 bit (or higher) internal LUT that feeds an 8 bit output. This reduces banding and posterization.
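
As a rough illustration (an arbitrary gamma-style curve, not any particular card's or display's calibration), here's what the extra precision buys: with an 8-bit output range, some adjacent input levels collapse into the same output code after the correction is applied, which is exactly where banding and posterization show up, while at 10 bits every input level keeps its own value.

```python
import numpy as np

# Hypothetical correction curve for illustration only (gamma 1/1.2).
x = np.arange(256) / 255.0                       # 8-bit input ramp, normalised
curve = x ** (1 / 1.2)                           # some mild calibration adjustment

lut_8bit  = np.round(curve * 255).astype(int)    # curve quantised to 8 bit
lut_10bit = np.round(curve * 1023).astype(int)   # same curve at 10 bit precision

print(len(np.unique(lut_8bit)))                  # < 256: adjacent inputs collapse -> banding
print(len(np.unique(lut_10bit)))                 # 256: every input keeps its own level
```

Whether that 10-bit result actually makes it to the screen then depends on the output bit depth and/or dithering, which is what my question is about.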