How do I enable 10bit color on my Asus PB278Q 4K Monitor?

DukeOvilla
Apr 23, 2013
Here is my monitor; it's supposed to be 10-bit.
http://www.newegg.com/Product/Product.aspx?Item=9SIA24G1ZP5925&cm_re=asus_4k_monitor-_-24-236-399-_-Product
YES, I have it plugged in with a DisplayPort cable.

When I try to enable 10-bit color in the Nvidia Control Panel for my GTX 980 Ti, it shows only 8-bit color; there's no 10-bit option like there SHOULD be.


NO, you do NOT need a Quadro card; I've seen people with 970s enable 10-bit color.

How do I enable 10-bit color? Thanks.
 


I'm using the DisplayPort cable the monitor came with. In the monitor's OSD it says it's using DP 1.2; 1.2 supports 60 Hz and the 1.1 option did not.

It's an ASUS PB287Q plugged into one of the DisplayPort connectors on my GTX 980 Ti.
 
It's dithering to 10-bit; it's a native 8-bit panel.

[strike]There appears to be two versions of the Asus PB278Q, a 2560x1440 and a 3840x2160. However,[/strike] if it's really a true 10-bit display, which I highly doubt, then contact ASUS. A few graphics cards can supposedly deliver 10-bit, but I believe this feature has been locked so that you pay more for the workstation-grade cards. What you are using is very likely an 8-bit panel that's faking 10-bit. Better than 8-bit, worse than 10-bit.
 

Nope. It dithers unless it's receiving 10-bit input, in which case it will display 10-bit color.

The specifications say

Display Colors : 1073.7M (10bit)
"Real 10-bit colors means the ASUS PB287Q provides smooth color gradations for a more natural-looking transition between hues. The PB287Q also delivers an impressive 1ms (gray-to-gray) fast response time and a 60Hz* refresh rate for fluid and responsive visuals needed to experience today's games to their fullest."

And yet it's an 8-bit panel... that displays 1.07 billion colors? That's not how it works.

WTF.
False advertising?
 
Your monitor is the 287Q, not the 278Q as you put in your title.
According to the ASUS website, it's Display Colors: 1073.7M (10-bit).

I can use 10 bpc on my GTX 980 Ti, so it's not a card issue.
What driver version are you on?
 


Is yours the same monitor?
 
This is what Tom's had to say in their review of the ASUS 287Q display:

"This brand-new part achieves 10-bit color by using 8-bits with FRC like many professional monitors we've reviewed. So even though most users will use an all-8-bit signal chain, it will accept 10-bit formats too. And all incoming 8-bit content is upconverted by the PB287Q. The end result is a palette of 1.07 billion colors."

So, it looks like your screen can handle 10-bit formats whether it's using tricks to get there or not, and your card can handle 10-bit formats too, but do NVidia's drivers realize your screen is capable of accepting 10-bit output?
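To be clear about what "8-bits with FRC" means in that quote: the panel approximates each 10-bit level by rapidly alternating between the two nearest 8-bit levels, so the time-average of the frames matches the requested value. A toy sketch (illustrative only; no real panel controller schedules frames exactly like this):

```python
# Toy model of FRC (frame rate control) temporal dithering: an 8-bit
# panel fakes a 10-bit level by alternating between the two nearest
# 8-bit values so the average over a few frames matches the target.

def frc_frames(level_10bit, num_frames=4):
    """Return per-frame 8-bit values whose average approximates level_10bit / 4."""
    base = level_10bit // 4          # nearest lower 8-bit level
    remainder = level_10bit % 4      # how many of every 4 frames bump up by one
    return [min(base + (1 if f < remainder else 0), 255)
            for f in range(num_frames)]

frames = frc_frames(513)             # 10-bit level 513 ~ 8-bit 128.25
print(frames)                        # [129, 128, 128, 128]
print(sum(frames) / len(frames))     # 128.25
```

Seen at 60 Hz, your eye averages those frames into an in-between shade, which is why an 8-bit + FRC panel can legitimately claim 1.07 billion colors without having true 10-bit drive electronics.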
 


That's the thing, it does not seem to. I'm not sure how to fix it... perhaps a fresh install of the drivers? I was using a standard 1080p monitor when I did the driver installation initially.
 
I would honestly not worry about it. To truly take advantage of any additional color space, you need an entire 10-bit graphics path that is aware and can utilize it, start to finish, starting with 10-bit color capable software. Just telling your NVidia drivers to output a 10-bit signal doesn't mean any of the programs you're running are going to start feeding 10-bit graphics to the graphics card to take advantage of any supposed improvement in color space.

The other thing to take away from the Tom's article I linked to is that the screen up-converts any 8-bit color input anyway, so you might not see much improvement even if you were running 10-bit software. In other words, no matter the setting in your NVidia control panel, your 8-bit software already looks better than it would on a plain 8-bit screen, and color banding should already be minimized by the screen's up-conversion — unless of course you're dealing with really bad source material, which adding bits isn't going to fix.

Edit: One other point to make is, if you did manage to set your NVidia control panel to output 10-bit color, and as a result the screen then handled the signal without color up-scaling, your 8-bit software may actually end up looking worse.
 
It is an 8-bit panel with A-FRC dithering. Check the EDID: it advertises there that it can accept only 8 bits per channel. The additional 2 bits are done in the monitor's firmware with dithering, so you won't ever get 10-bit in the Nvidia control panel with the PB278Q. And that again shows me there are no true 10 bpc monitors under $2000. Nada.
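If you want to check the EDID yourself, the declared per-channel bit depth lives in byte 20 (the Video Input Definition) of an EDID 1.4 block: for digital inputs (bit 7 set), bits 6–4 encode the depth. A minimal sketch, assuming an EDID 1.4 structure — this is not a full parser, and older EDID 1.3 blocks use those bits differently:

```python
# Sketch: read the declared color bit depth from a raw EDID 1.4 blob.
# Byte 20 is the Video Input Definition: bit 7 = digital input,
# bits 6-4 = bits per primary color (EDID 1.4 encoding).

DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def edid_bit_depth(edid: bytes):
    vid = edid[20]
    if not vid & 0x80:                     # analog input: no bit-depth field
        return None
    return DEPTHS.get((vid >> 4) & 0b111)  # None if the field is "undefined"

# Hypothetical example blob: byte 20 = 0b1010_0101 -> digital, 8 bits/channel
sample = bytes(20) + bytes([0b1010_0101])
print(edid_bit_depth(sample))  # 8
```

On Linux you can feed this the raw blob from `/sys/class/drm/*/edid`; on Windows, tools like a Monitor Asset Manager-style EDID dump will show the same field.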
 
Hello, better late than never.
10-bit will be achieved through DisplayPort 1.4, and DisplayPort 1.4 is only now available on the newest graphics cards, Nvidia's GTX 10 series. Your 980 Ti is only DisplayPort 1.2. I confirm this via my sources as follows:
The display support specs via GeForce for the GTX 980 Ti show DisplayPort 1.2, while the newest 10-series cards are DisplayPort 1.4.
See the Wikipedia DisplayPort article for the version changes from 1.2 to 1.4. P.S. There are also effects on HDMI 2.0b and HDCP 2.2 compliance.
 


How about the PB328Q @ 1440p, 10-bit with a 12-bit lookup table, for $700 new and around $400 used?
But yeah, easier to post schmu instead of using Google for 1 min.
 
GTX 970 here. It does show me 10 bpc in the Nvidia control settings. So, contrary to what's said above, 9-series cards do output 10-bit on a true 10-bit panel. I use the DisplayPort cable that came with my monitor, so I don't know which version it is.
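For what it's worth, rough back-of-envelope bandwidth arithmetic supports this: 4K60 at 10 bpc appears to fit within DP 1.2's payload, so a newer connector version isn't strictly required for the bit depth alone. A sketch, assuming a ~533 MHz CVT-RB pixel clock for 3840x2160 @ 60 Hz and ~17.28 Gbit/s effective payload for DP 1.2 HBR2 (four lanes at 5.4 Gbit/s each, minus 8b/10b coding overhead); this is an estimate, not a compliance calculation:

```python
# Back-of-envelope DisplayPort 1.2 bandwidth check.
# Assumptions: CVT-RB pixel clock ~533 MHz for 4K60; HBR2 effective
# payload ~17.28 Gbit/s (4 lanes x 5.4 Gbit/s x 0.8 after 8b/10b).

pixel_clock_hz = 533e6
hbr2_payload_bps = 17.28e9

for bpc in (8, 10):
    needed = pixel_clock_hz * bpc * 3    # 3 color channels
    verdict = "fits" if needed <= hbr2_payload_bps else "exceeds"
    print(f"{bpc} bpc: {needed / 1e9:.2f} Gbit/s ({verdict} HBR2)")
```

Ten bits per channel needs roughly 16 Gbit/s, which is tight but under the HBR2 payload, which would explain 9-series owners seeing the 10 bpc option on panels that advertise it.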