I've been trying to find an answer to this, but there seems to be very little information out there, and the theories differ...
The question itself is rather simple:
If I set my GTX 970 to 10 bits per channel on DisplayPort, or 12 bits per channel on HDMI, do I get an 8-bit LUT with 10- or 12-bit precision? And is there any dithering involved?
Let's assume we have an 8-bit display (with a 10-bit LUT, if that matters). So, no 10-bit output on the display side (that's another ball game, and I believe nVidia has enabled 10-bit output for some applications on the GeForce cards).
I've seen claims that nVidia is lagging well behind AMD here by running 8 bits without dithering. With the current drivers, however, it seems GeForce cards can get 10- or 12-bit precision over DisplayPort or HDMI, though still without dithering. I don't know how reliable that information is, though.
Maybe someone here knows how it works?
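
To make concrete what I mean by "higher precision plus dithering": here's a toy sketch of my own (purely illustrative, not anything nVidia's driver is confirmed to do) showing how temporal dithering would let an 8-bit panel approximate a value computed at 10-bit LUT precision by averaging over frames:

```python
# Illustrative sketch only: how dithering can preserve >8-bit LUT precision
# on an 8-bit output. Not a description of any actual driver behaviour.
import numpy as np

rng = np.random.default_rng(0)

# A 10-bit LUT result that falls between two 8-bit codes, e.g. 513/1023 of full scale.
value_10bit = 513
target = value_10bit / 1023.0          # ideal intensity in [0, 1]

scaled = target * 255                  # ideal (fractional) 8-bit level
truncated = int(scaled)                # plain truncation to 8 bits loses the fraction

# Temporal dithering: each frame, round up or down with probability equal to
# the fractional remainder, so the average over time hits the target level.
frames = 1000
frac = scaled - np.floor(scaled)
dithered = np.floor(scaled) + (rng.random(frames) < frac)

print("ideal 8-bit level:     ", scaled)
print("truncated:             ", truncated)
print("dithered mean (1000 f):", dithered.mean())
```

Without dithering, the fractional step is simply lost and you get banding; with it, the time-averaged output lands very close to the 10-bit target.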