Question: Enabling 10-bit (8-bit+FRC) on the BenQ GW2765HT?

May 9, 2024
Hi, I recently bought a new primary monitor (Koorui G10) that supports HDR, so I turned my previous primary monitor (BenQ GW2765HT) into a secondary one.
I read on https://www.displayspecifications.com/en/model/052d6b and the manufacturer's site that my old monitor supports 10-bit (8-bit+FRC) as well.
How do I enable it, though? I could only find this guide on the manufacturer's website (https://www.benq.com/en-us/support/downloads-faq/faq/product/application/monitor-faq-kn-00046.html), but I don't have a Quadro GPU, nor can I find the resolution file on their website. My monitor isn't even on the list of Applicable Models.
Can someone help, please?
 
Looks like there is a discrepancy in the BenQ description, which is just copied by the specs site.

Could just be that older model information hasn't been properly updated/carried over to their newer site.
This display is over 10 years old.
 
May 9, 2024
Well, on the official product site it also claims that it supports 10-bit: https://www.benq.eu/de-de/eol-archives/monitors/gw2765ht.html
The site is in German but you should be able to translate it with Google Translate or some other tool.
BenQ's German guide for enabling 10-bit features just one monitor on the list of Applicable Models, so I assume the list is incomplete in other languages too.
I was originally looking for the monitor's manual because I wanted to enable input auto-detect: I keep the monitor connected to my gaming PC via DisplayPort and plug in my laptop through an HDMI cable whenever the gaming PC is off, so that I have a second display for productivity.
I know the display is pretty old, and with a 4 ms response time and just 60 Hz it's terrible for gaming nowadays, but it's still a large 1440p monitor with good color reproduction, so it's perfect as a second monitor. I wanted to turn on 10-bit so that pictures that aren't HDR but still use the 10-bit colorspace look the same on both monitors.
 

NedSmelly

Are you running the Nvidia Studio drivers?

There should be an option in Nvidia Control Panel: Change resolution > Apply the following settings > Use Nvidia color settings > Output color depth.

Check Windows Settings > Display > Advanced after this.

Also try a newer HDMI 2.1 or DisplayPort 1.4 compliant cable - an older-spec one might be limiting bandwidth.
 
May 9, 2024
I did download the Nvidia Studio drivers, but even then the output color depth in the control panel only showed 8-bit as an option.
Connecting it with the cable from my new monitor (which supports full 10-bit up to 165 Hz, 8-bit+FRC up to 240 Hz at 1440p, and 4K up to 108 Hz) didn't help either.
Since the display is pretty old, it only supports HDMI 1.4 and DisplayPort 1.2 anyway, but they should have been aware of the limitations of those standards.
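For what it's worth, here's a rough back-of-the-envelope check (my own sketch, assuming the standard CVT-RB timing for 1440p60 and the usual effective link rates after 8b/10b encoding), and it suggests bandwidth isn't even the bottleneck here:

```python
# Rough link-bandwidth check for 2560x1440 @ 60 Hz at 10 bpc RGB.
# Assumes CVT-RB blanking (2720 x 1481 total); the actual EDID timing may differ slightly.

def required_gbps(h_total, v_total, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed video data rate in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_channel * channels / 1e9

need = required_gbps(2720, 1481, 60, 10)   # ~7.3 Gbit/s
dp12_limit = 17.28                         # DP 1.2 HBR2, 4 lanes, effective rate after 8b/10b
hdmi14_limit = 8.16                        # HDMI 1.4, 340 MHz TMDS, effective video data rate

print(f"1440p60 @ 10 bpc needs ~{need:.1f} Gbit/s")
print("fits DP 1.2:  ", need < dp12_limit)     # True - plenty of headroom
print("fits HDMI 1.4:", need < hdmi14_limit)   # True - less headroom, but it fits
```

So at 1440p 60 Hz even a full 10 bpc signal fits comfortably within DisplayPort 1.2, which makes the missing option look more like the monitor not advertising 10 bpc to the GPU than a cable or bandwidth problem.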
A bit off-topic, but my new monitor also shows 4K in the resolution list and lets me apply it.
Does that mean the panel is 4K even though I bought it as a 1440p display?
My PC mostly can't handle that resolution for gaming, but for watching content it would be nice to enable 4K.
 
It likely won't be supported over HDMI; you'll need to use DisplayPort, if you haven't already checked with that.

Regarding the 4K option:
https://linustechtips.com/topic/156...resolution-on-1440p-monitor/#comment-16366337
 
May 9, 2024
Yeah, I only connected it to my main PC through DisplayPort, as I use the HDMI port to connect my laptop, which doesn't even have a dedicated GPU and only has an 8-bit 1080p panel.
Does the 4K option for console compatibility only apply to HDMI?
Because I also get that option when connected through DisplayPort. The monitor came with a DisplayPort cable, and I used that both to connect it and to test whether my old monitor was bandwidth-limited because of the previous cable. While I don't know which cable spec they bundled, 1440p at 240 Hz at 8-bit is more than what DisplayPort 1.2 is capable of, so the cable should be at least 1.3.
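Running the same kind of rough numbers as a sanity check (approximate; I've added about 12% for blanking intervals, and the exact figure depends on the monitor's timings):

```python
# Rough check: can DP 1.2 carry 2560x1440 @ 240 Hz at 8 bpc RGB?
# ~12% allowance added for blanking; the exact EDID timing will shift this a little.
need = 2560 * 1440 * 240 * 8 * 3 * 1.12 / 1e9   # ~23.8 Gbit/s uncompressed

print(f"1440p240 @ 8 bpc needs ~{need:.1f} Gbit/s")
print("fits DP 1.2 / HBR2 (17.28 Gbit/s):", need < 17.28)       # False
print("fits DP 1.3/1.4 / HBR3 (25.92 Gbit/s):", need < 25.92)   # True
```

So yes, that signal is well beyond what DisplayPort 1.2 / HBR2 can carry, and the bundled cable has to be good for HBR3 speeds.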
Is there any way to check if the monitor truly supports 4K or if this is just for console compatibility?
The company that made it, Koorui, is pretty new (founded in 2021) and therefore doesn't have a lot of info out there, but that also makes their monitors pretty cheap. I got this 27-inch 1440p 240 Hz HDR1000 VRR Mini-LED VA monitor for just 270€. They might even lose money on those displays, and maybe it's cheaper for them to sell the same panel under different models.
 

NedSmelly

I'm getting mixed messages from BenQ websites after searching for the specs of your monitor. The US English website and product sheet have no mention of 10-bit.

At the end of the day, I don't think 'fake 10-bit' (8-bit+FRC) is really worth the bother. I have two BenQ high-end monitors (BL2711 and SW271C) that I use for professional photo editing; both have been calibrated using an X-Rite puck sensor and both support 8-bit+FRC. I compared that mode to standard 8-bit in SDR using the Photoshop gradient step test, and it honestly made no qualitative difference to the visuals.
I wanted to turn on 10-bit so that pictures that aren't HDR but still use the 10-bit colorspace look the same on both monitors.
10-bit is not a colour space. It is the number of 'gradations' within a colour space, i.e. bit depth. sRGB, AdobeRGB, DCI-P3, and Rec.709 are colour spaces. To make two different monitors match in colour representation, you need to calibrate the output to a matching colour space (e.g. both monitors set to sRGB), regardless of bit depth. The most accurate way to do this is with a hardware calibration device, such as a Calibrite/X-Rite or Datacolor Spyder.
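To put rough numbers on 'gradations' (plain arithmetic, nothing monitor-specific):

```python
# Bit depth sets the number of steps per channel, not the colour space.
for bits in (6, 8, 10):
    levels = 2 ** bits      # gradations per R/G/B channel
    total = levels ** 3     # total displayable colours
    print(f"{bits:>2}-bit: {levels:>4} levels per channel, {total:,} colours")

#  6-bit:   64 levels per channel, 262,144 colours
#  8-bit:  256 levels per channel, 16,777,216 colours    (the familiar "16.7 million")
# 10-bit: 1024 levels per channel, 1,073,741,824 colours (the "1.07 billion" on spec sheets)
```

Either monitor can cover exactly the same colour space at either bit depth; the extra bits only make the steps between adjacent shades finer.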
 
May 9, 2024
Well, the product sheet you sent says under display colors that it has 1.07 billion, which is the number for 10-bit.
I know that 10-bit isn't a color space and that it just refers to the number of gradations, but if one monitor displays a color that falls between two of an 8-bit monitor's steps, it will look slightly off because the 8-bit monitor can't display it.
According to BenQ, the difference between real 10-bit and fake 10-bit is almost indistinguishable to most people.
That's why I'd rather run the Koorui monitor at 240 Hz with 8-bit+FRC than at 165 Hz with real 10-bit.
I could test it if I got the BenQ monitor to run either fake or real 10-bit.
I was going to calibrate the monitors with an X-Rite i1 Display Pro, since I can rent one for a day for 15€ where I live, but I was thinking that the monitors have to be at the same color bit depth for that.
 
On the 4K option over DisplayPort: it's up to the manufacturer. There's no reason it can't also be done on DisplayPort. Typically it's only implemented on HDMI, since consoles don't have DisplayPort. But newer off-brand companies may not understand the rationale and just see that other companies are doing it, so they replicate the same feature.

Not sure what you mean by "truly" supports 4K. It truly supports accepting a 4K signal. If you mean, does it physically have 3840 pixels instead of 2560 along the bottom: no, otherwise it would just be sold as a 4K monitor, not 1440p.

As for 8-bit+FRC versus real 10-bit: the FRC process is something performed inside the monitor. It receives a true 10 bpc signal from the GPU and then uses FRC to emulate the requested image on its 8 bpc panel.

So you can't run a higher refresh rate by using 8 bpc + FRC compared to 10 bpc. It's the same 10 bpc signal in both cases; in fact, the PC does not actually know FRC is being performed. It just sees a monitor that says it can take a 10 bpc signal, so it sends it one.

What you are choosing between is 240 Hz with 8 bpc, or 165 Hz with 8 bpc + FRC.
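If it helps to picture what the monitor's FRC is doing, here's a toy sketch of the principle (not BenQ's or anyone's actual algorithm): a 10-bit level that falls between two 8-bit steps gets shown by alternating the two nearest 8-bit levels over successive frames, so the time-average lands in between.

```python
# Toy illustration of FRC / temporal dithering - not any vendor's real algorithm.
def frc_frames(level_10bit, n_frames=4):
    low = level_10bit // 4    # nearest 8-bit step below (10-bit has 4x as many steps)
    frac = level_10bit % 4    # how far between the two 8-bit steps (0..3)
    # Show the higher step in 'frac' out of every 4 frames, the lower step otherwise.
    return [min(low + 1, 255) if i < frac else low for i in range(n_frames)]

frames = frc_frames(514)            # 10-bit level 514 sits between 8-bit 128 and 129
print(frames)                       # [129, 129, 128, 128]
print(sum(frames) / len(frames))    # 128.5 -> time-average equivalent of 10-bit 514
```

Your eye averages the rapid alternation, so the panel appears to show the in-between shade even though it can only physically produce 8-bit steps.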
 
May 9, 2024
Makes sense, since they are a relatively new Chinese company; in that case, there is no point in checking. And yeah, I meant whether it physically has 3840 pixels.
Well, the Chinese monitor supports true 10-bit (I think), but since it doesn't have DSC, when I set the refresh rate above 165 Hz with HDR enabled, Windows shows 8-bit + dithering instead of 10-bit. I thought that stands for FRC.
 
It does. In that case, it's being done on the OS side, which is a relatively new development and is enabled automatically when needed for HDR.
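The rough bandwidth math lines up with that too (same caveat as earlier about approximate blanking): 10 bpc at 1440p still fits DP 1.4's HBR3 rate at 165 Hz, but not at 240 Hz, so without DSC the GPU falls back to 8 bpc and Windows dithers the HDR output.

```python
# Rough check of why HDR falls back to 8-bit + dithering above 165 Hz without DSC.
# ~12% blanking allowance; exact EDID timings shift these numbers slightly.
HBR3 = 25.92   # DP 1.3/1.4 effective Gbit/s over 4 lanes

for hz in (165, 240):
    need = 2560 * 1440 * hz * 10 * 3 * 1.12 / 1e9
    print(f"10 bpc @ {hz} Hz: ~{need:.1f} Gbit/s -> fits HBR3: {need < HBR3}")

# 10 bpc @ 165 Hz: ~20.4 Gbit/s -> fits
# 10 bpc @ 240 Hz: ~29.7 Gbit/s -> does not fit (hence 8 bpc + dithering)
```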
 