True 10bit vs 8bit+FRC

TehPenguin

Honorable
May 12, 2016
I wish to buy either of the two: LG 34um88c-p or LG 34um88-p

The "C" one is about €100 cheaper, which is very appealing, but offers only 8bit+FRC colour coverage while the "non-C" one offers true 10bit colours.

I've asked LG about it and all they told me was that "Frame Rate Control Technology is something LG and other companies use to practically enhance the colour depth by 2 extra bits".

I could not find a comprehensive explanation for this, which is why I'm asking here: can someone explain to me, in words a simple man can understand, what the real difference is between true 10-bit and 8-bit + FRC when both claim to cover 1.07 billion colours?
 
About that 8bit or 10bit color depth.
What is the main reason for you to buy the monitor?
Gaming? Photo/video editing?
If it is gaming only, simply go for the cheaper one without much thinking.
If it is for photo/video editing, go for the 10-bit variant. The 10-bit variant is better/smoother at colour gradation/transitions, e.g. the banding you see in a picture of the sky with a gradient from white to dark blue.
 
FRC is a type of dithering to approximate the extra colors.

Basically it means you flash the pixel between two colors on either side of the color you're trying to approximate. To the human eye, it more or less blends together into the right color.

But it is still an imperfect solution. Certainly 8-bit+FRC is closer in quality to basic 8-bit than to real 10-bit color.

That said, 8-bit color alone is already pretty good, and FRC will generally improve on that a little further.
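To make the "blending" idea concrete, here is a rough Python sketch (purely illustrative, not how any panel's controller actually works) of how a pixel that alternates between two nearby shades averages out, over time, to a value in between:

```python
# Illustrative only: a pixel that flips between two nearby 8-bit shades every
# frame is perceived as roughly their time-weighted average.
frames = [100, 101, 100, 101]           # shade shown on four successive frames
perceived = sum(frames) / len(frames)   # crude stand-in for what the eye integrates
print(perceived)                        # 100.5 -> a shade the panel can't show directly
```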
 

aylafan

Distinguished
Feb 24, 2006
You will probably need a professional graphics card like a Quadro/FirePro to display true 10-bit color, rather than a consumer card like the GTX 1080. I've read of cases where consumer cards can also output 10-bit color, but some say it's an emulation (8-bit + FRC). You might want to research this if you're going for the true 10-bit monitor.
 
I assume you know what polarizers and polarized light are? If not, google it. LCDs work with a polarizing crystal layer over a polarizing sheet. The polarization direction of the crystal layer is controlled electronically (LCD = liquid crystal display). When it is aligned with the polarizing sheet's, all of the backlight is let through and the pixel is white (well, 50%, since the polarizing sheet only lets 50% of the light through). When it is oriented perpendicular to the sheet's, it blocks all the light and the pixel is black. When it's oriented somewhere between 0 and 90 degrees, some of the backlight is let through and you get greys.

Unfortunately the level of control over the polarizing crystals isn't quite enough to give 1024 orientations between 0 and 90 degrees, so panels use FRC to create in-between shades. (Older panels couldn't even do 256 orientations, leading to 6-bit + FRC.) Using 8-bit + FRC as an example, you want to produce 1024 shades, but the panel can only orient the crystals to produce 256 of them on the 10-bit scale, e.g. shade 0, shade 4, shade 8, etc. If the pixel is supposed to display shade 401, but the panel can only produce 400 and 404, what you do is rapidly shift the pixel between 400 and 404.

You show 400 75% of the time and 404 25% of the time (hence FRC - Frame Rate Control: the shade is varied from frame to frame). This produces the illusion of the pixel showing shade 401. If it showed 400 and 404 each 50% of the time, it'd produce the illusion of shade 402. If it showed 400 25% of the time and 404 75% of the time, that would be shade 403.
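As a rough sketch of that bookkeeping (illustrative only, using the numbers above, not how LG's controller actually works): the target 10-bit shade splits into the nearest shade the panel can really produce plus a remainder, and the remainder decides how many frames out of four show the higher shade.

```python
def frc_frames(target_10bit):
    """Return four frames of panel-producible shades (multiples of 4 on the
    10-bit scale) whose average approximates the requested 10-bit shade."""
    low = (target_10bit // 4) * 4    # nearest shade the panel can show, below the target
    high = min(low + 4, 1020)        # next shade up (1020 = 255 * 4 is the panel's top shade)
    n_high = target_10bit - low      # how many of the 4 frames use the higher shade
    return [high] * n_high + [low] * (4 - n_high)

print(frc_frames(401))   # [404, 400, 400, 400] -> averages to 401
print(frc_frames(402))   # [404, 404, 400, 400] -> averages to 402
print(frc_frames(403))   # [404, 404, 404, 400] -> averages to 403
```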

The 6-bit + FRC panels were a bit problematic. Your eye can distinguish roughly 256 shades of red and blue (more for green). So this flickering between 6-bit colors was visible, especially if you're sensitive to the flickering of fluorescent lights. The pixels would appear to swim a little, especially in your peripheral vision (which is more sensitive to changes in brightness) or if you were moving your eyes around the screen (so the different brightnesses fell onto different photoreceptors in your eye, and your brain could see that the image was changing).

But I would expect it to be much less of a problem with 8-bit + FRC panels approximating 10-bit. As I said, it's mostly in the greens where you can discern adjacent shades with 8-bit color, and even that is just barely. The main reason to use 10-bit panels is that most modern camera equipment can record 10 bits or more per channel. When you convert that down to 8-bit for display, it can create rounding biases which show up as slight banding.
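As a rough illustration of that banding (again just a sketch, not any particular editing pipeline): when a smooth 10-bit ramp is naively truncated to 8-bit, every four distinct input shades collapse into one output shade, which across a large smooth area shows up as visible steps.

```python
ramp_10bit = list(range(400, 412))         # 12 consecutive 10-bit shades: a smooth ramp
ramp_8bit = [v // 4 for v in ramp_10bit]   # naive truncation down to 8-bit

print(ramp_10bit)  # [400, 401, 402, ..., 411] - every step distinct
print(ramp_8bit)   # [100, 100, 100, 100, 101, 101, 101, 101, 102, 102, 102, 102] - bands
```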

The other reason for 10-bit is if the monitor is displaying a larger color space than sRGB. For example, if you're displaying in the Adobe RGB color space (which is about 40% bigger than sRGB), that effectively stretches the difference between adjacent colors by about 40%. Now 8-bit color isn't enough and banding becomes easily visible, so you need 10-bit. However, it looks like both of the monitors you've listed are limited to sRGB, so this shouldn't be a factor.
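A back-of-the-envelope way to see that "stretching" (using the rough ~40% figure above as a stand-in for real colorimetry, so the numbers are only indicative):

```python
# Relative size of one code step, treating gamut size as a simple scale factor.
srgb_step_8bit      = 1.0 / 255    # baseline: one 8-bit step across sRGB
adobergb_step_8bit  = 1.4 / 255    # the same 255 steps stretched over a ~40% larger gamut
adobergb_step_10bit = 1.4 / 1023   # 10-bit shrinks the step back below the sRGB baseline

print(f"{srgb_step_8bit:.5f}  {adobergb_step_8bit:.5f}  {adobergb_step_10bit:.5f}")
# 0.00392  0.00549  0.00137
```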

Since the vast majority of images and movies available are encoded with 8-bit color, they would display the same on both monitors. Unless you're working with photos or video shot with 10-bit or higher color depth, there really isn't any reason to prefer true 10-bit over 8-bit + FRC. And even then most people would be hard-pressed to see the difference.
 
Solution

jsmjr

Prominent
Jul 1, 2017


I just wanted to chime in to mention that I've seen nothing in the literature to suggest that the "non-C" version of the 34UM88 has true 10-bit color. The only difference reported between the models is that the more expensive one has DisplayPort. There's no way LG put 10-bit in a monitor of the same model number and then only charged 100 extra for it. 10-bit would cost more like 300-500 more.
 

Kbswaff

Reputable
Jan 23, 2016
jsmjr said:
I just wanted to chime in to mention that I've seen nothing in the literature to suggest that the "non-C" version of the 34UM88 has true 10-bit color. The only difference reported between the models is that the more expensive one has DisplayPort. There's no way LG put 10-bit in a monitor of the same model number and then only charged 100 extra for it. 10-bit would cost more like 300-500 more.


I agree with you there. According to www.displayspecifications.com, both LG monitors offer 8-bit + FRC and NOT true 10-bit color. It is marketing... in much the same way that 720p is advertised as HD (and 1080p as Full HD). It seems that almost every consumer-grade monitor is 8-bit + FRC.

Professional-grade 10-bit video production monitors on www.bhphotovideo.com cost $2,750 to $20,000.

By checking https://versus.com I was able to determine that the C means Curved. The reason for the extra $100 on the non-C is that it offers Thunderbolt, which is the fastest video + data connection available, kind of like a "super-USB": with Thunderbolt, a monitor can be connected directly to another monitor, a MacBook Pro can be connected directly at 4K 60 Hz, and a hard drive can be connected to the monitor at the highest data transfer rate available of any cable technology.

But I have to contradict you, jsmjr. Both monitors have DisplayPort: the LG 34UM88-P has 1 DisplayPort and 2 Thunderbolt 2.0 ports, while the 34UM88C-P also has 1 DisplayPort but no Thunderbolt. And I believe another difference is that the 34UM88-P has a glossy screen while the 34UM88C-P has a matte anti-glare screen. Also, the curved version has lower power consumption, at 56.7 W compared to 70 W for the non-curved version.