Help picking the best chroma subsampling and color depth settings

Tensai30

Jul 4, 2016
I'm using a Sharp 43n700u UHDTV as a monitor, connected to an Nvidia GTX 1070 through HDMI. At 2160p I have:
ycbcr420 8bpc, 12bpc
ycbcr422 8bpc, 10bpc, 12bpc
ycbcr444 8bpc
RGB limited and full
I can't decide which one to use. RGB full seems like the best option, but ycbcr422 12bpc looks better to me. Everything I see online says ycbcr444 would be the best choice, but I can only select 8bpc for it.
 
That's the answer I was afraid of. I saw the same thing when searching on Google. For some reason, ycbcr420 and 422 at 12bpc look better to me: the colors are noticeably better. That goes against everything I keep seeing online, so either I'm going crazy or something else is going on with my TV. How about RGB full? Wouldn't that be better to use than 444?
 

What are you looking at when you say ycbcr422 12bpc looks better?

Most TVs will assume a chroma subsampled image is video, and will enable video processing options like sharpening. Sharpening actually degrades image quality, but does it in a way which tricks your brain into thinking the image is sharper. So it may look better, but actually contains less detail.
https://en.wikipedia.org/wiki/Unsharp_masking
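To make the "sharpening invents detail" point concrete, here's a toy sketch (plain Python, a 1-D signal, hypothetical helper names) of what unsharp masking does to a clean edge:

```python
# Toy 1-D unsharp mask: sharpened = original + amount * (original - blurred).
# The overshoot/undershoot around the edge is the "halo" you see on text
# and window borders when a TV sharpens a desktop image.

def box_blur(signal, radius=1):
    """Simple box blur; the window is clamped at the ends of the signal."""
    n = len(signal)
    out = []
    for i in range(n):
        window = signal[max(0, i - radius):min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out

def unsharp_mask(signal, amount=1.0):
    """Boost the difference between the signal and its blurred copy."""
    blurred = box_blur(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 0, 100, 100, 100, 100]  # a clean dark-to-bright edge
print(unsharp_mask(edge))
```

The sharpened edge overshoots to values that were never in the original (below 0 on the dark side, above 100 on the bright side): those out-of-range values are the halos.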

Putting it in RGB full mode should automatically turn off unnecessary processing like sharpening. This is the best mode for displaying desktop graphics. Putting it in one of the other modes with processing turned on (as if it were a video image) may look better in games and videos, especially if the TV is some distance away. But for desktop work, you will notice the halos created by unsharp masking detracting from the image quality.

8-bit, 10-bit, and 12-bit color depth won't have any effect if your graphics card is only outputting 8-bit color. (That's the 24-bit or 32-bit color mode in the Windows options: 16.7 million colors. 32-bit mode is 24-bit plus an 8-bit alpha channel for transparency effects like Aero.) Only specialized graphics and video work needs more than 8-bit color, particularly when wide-gamut monitors (e.g. the Adobe RGB color space) are being used. The vast majority of monitors and TVs are limited to sRGB. If what I just wrote is meaningless gibberish to you, then you don't need more than 8bpc, and RGB full will serve you fine.
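To put numbers on the bit-depth point, a quick back-of-the-envelope sketch:

```python
# Total colors at each per-channel bit depth (three channels: R, G, B).
for bpc in (8, 10, 12):
    colors = (2 ** bpc) ** 3
    print(f"{bpc} bpc -> {colors:,} colors")
# 8 bpc  -> 16,777,216       (the "16.7 million" of 24-bit color)
# 10 bpc -> 1,073,741,824    (~1.07 billion)
# 12 bpc -> 68,719,476,736   (~68.7 billion)
```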

RGB mode individually tells each pixel which red, green, and blue values to display. There is no room for picture processing. The TV just displays what the computer tells it to display. (RGB limited is a weird mode which clips the darkest blacks and brightest whites. I still haven't figured out why it even exists.) If the colors look better in a chroma subsampled mode, that's probably due to a color setting on your TV exaggerating the colors. This is not an uncommon reaction - people tend to like overly-bright, over-saturated, and over-sharpened images (at least for video). But RGB full mode will come closest to exactly what the computer is trying to display.
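On the limited-range point: limited range is a holdover from broadcast video levels, where black and white sit at 16 and 235 instead of 0 and 255. A rough sketch (plain Python, hypothetical helper names) of what it does, and what goes wrong when the PC and TV disagree about which range is in use:

```python
# Full-range RGB puts black at 0 and white at 255. Limited ("video") range
# squeezes the same black-to-white ramp into 16-235, a holdover from
# broadcast video levels.

def full_to_limited(v):
    """Map a full-range (0-255) value into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def clip_to_limited(v):
    """What happens to full-range values on a display expecting limited range."""
    return max(16, min(235, v))

print(full_to_limited(0), full_to_limited(255))  # 16 235: black/white survive
print(clip_to_limited(5), clip_to_limited(250))  # near-black and near-white
                                                 # detail gets crushed away
```

That mismatch, crushed blacks and clipped whites when one side sends full range and the other expects limited, is the usual symptom of picking the wrong one of these two modes.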

Edit: Been trying to find the color gamut of the TV model you listed without success. I did find one comment saying that that TV had a wider than normal gamut (which would indicate it exceeds 100% sRGB). If that's the case, that would explain why the colors look better in the 12bpc option - it is stretching the colors (making them more saturated) to cover the TV's wider color gamut. While this makes the image more colorful, it is less realistic (over-saturated), and you are not seeing the image as it's originally intended.
 
Solution
So basically it's this: 4:4:4 will offer you 16.7 million colors and color data for every pixel.
4:2:2 will let you have several billion colors, but only half the horizontal color resolution. So more colors, but less color density, since pixels must share the same color in groups of two horizontally.
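The trade-off above can be put in numbers: a quick sketch of the average bits per pixel each mode costs (standard Y'CbCr subsampling ratios, nothing specific to this TV):

```python
# Average bits per pixel for Y'CbCr subsampling modes at a given bit depth.
# Per 4-pixel block (2x2): 4:4:4 keeps 4 chroma samples, 4:2:2 keeps 2
# (halved horizontally), 4:2:0 keeps 1 (halved in both directions).

def bits_per_pixel(bpc, chroma_samples_per_4_pixels):
    # 4 luma samples plus 2 chroma components per chroma sample,
    # averaged over the 4-pixel block
    return bpc * (4 + 2 * chroma_samples_per_4_pixels) / 4

for name, chroma in (("4:4:4", 4), ("4:2:2", 2), ("4:2:0", 1)):
    for bpc in (8, 12):
        print(f"{name} at {bpc}bpc: {bits_per_pixel(bpc, chroma)} bits/pixel")
```

Note that 4:2:2 at 12bpc costs the same 24 bits per pixel as 4:4:4 at 8bpc, which is roughly why the TV can offer one but not the other within HDMI's bandwidth at 2160p60.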



 

Yeah, I figured that using no chroma subsampling at all (RGB full) would be the best option, and I kept it at that in the Nvidia control panel for quite a while. It wasn't until recently, when I started playing Resident Evil 7, a game that has HDR, that I noticed ycbcr420 12bpc looking much better. I then tried it on 4K movies and other desktop content, and the colors seemed better there too. At least I know I'm not going crazy. Thanks.