4k 10bit a farce?

netcommercial


Here is the challenge: actually getting 4K 10-bit out of the monitor after buying it.
From what I understand, 10-bit 4K is really not being delivered from a GPU... or in my case an Nvidia Quadro 4000, which does indeed deliver 10-bit, but at 2560x1600, as long as I have the right cable as well. http://www.nvidia.com/object/product-quadro-4000-us.html

My question is, for those of you buying these easily $500+ monitors and plugging them into $500+ GPUs, please explain to me:

How will I get 10-bit 4K out of the GPUs that are out there? Like this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824236399

As I see it, I can have 10-bit 2K with a $750.00 GPU, BUT I still do not have an HDMI 2.0 out on that card...? So in essence, aren't I losing my 10-bit due to subsampling on the HDMI out? See the Y'CbCr 4:2:0 subsampling graphics in this article:
http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
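Rough numbers on the subsampling point (my own back-of-the-envelope, not figures from the article): 4:2:0 throws away chroma resolution rather than bits per sample, and it carries roughly half the data of 4:4:4, which is the trick those HDMI 1.4-class outputs rely on.

```python
# Average samples carried per pixel for common Y'CbCr subsampling modes:
# 4:4:4 keeps full-resolution chroma, 4:2:0 keeps one Cb and one Cr per 2x2 block.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def bits_per_pixel(subsampling, bits_per_sample):
    return SAMPLES_PER_PIXEL[subsampling] * bits_per_sample

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"{mode} at 10-bit: {bits_per_pixel(mode, 10):.0f} bits/pixel")
# 4:2:0 needs half the bandwidth of 4:4:4, which is how an HDMI 1.4-class link
# squeezes a 4K60 signal through; colour detail is what gets traded away.
```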

 
The price of a GPU isn't a sign that the GPU is better in feature set or performance. Your GPU is an old Fermi card, so it has an older version of DP, 1.1a, which can still do 4K/30Hz with 10-bit. You don't have HDMI, but DP is better and is what you want to use anyway. The article you linked to is talking about old Kepler cards, which have HDMI 1.4a but do support 4K/60 over DP 1.2. The current gen, Maxwell, supports HDMI 2.0 and DP 1.2, both of which support it.
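A quick sanity check of those combinations (nominal link payload rates, and the blanking overhead is an assumption, so treat the numbers as ballpark):

```python
# Approximate required data rate for 4K RGB/4:4:4 10-bit vs. nominal link payloads.
# Link figures are the usual quoted payload rates; the 15% blanking allowance is a guess.
LINKS_GBPS = {
    "DP 1.1a (4x HBR)":   8.64,
    "DP 1.2 (4x HBR2)":  17.28,
    "HDMI 1.4":           8.16,
    "HDMI 2.0":          14.40,
}

def needed_gbps(width, height, hz, bpc, blanking=1.15):
    return width * height * hz * 3 * bpc * blanking / 1e9

for hz in (30, 60):
    need = needed_gbps(3840, 2160, hz, 10)
    fits = [name for name, cap in LINKS_GBPS.items() if cap >= need]
    print(f"4K/{hz} 10-bit needs ~{need:.1f} Gbit/s -> fits on: {fits}")
```

With those rough assumptions, 4K/30 10-bit just squeezes onto DP 1.1a, while 4K/60 10-bit at full 4:4:4 really wants DP 1.2.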
 
Suzuki - that was my point, and I was getting at how they call it 10-bit when really it is an upscaled 8-bit. So even if you do indeed have a 10-bit card (which I do), good luck finding a monitor (that is reasonably priced) as well as a transmission cable, as k1114 explained.

@unksol check the link to the card... it states that it does indeed support 10-bit.

So at this time it is kind of a farce for those buying a 10-bit 4K monitor, when the card, or the monitor for that matter, really is not 10-bit; it is upscaled or dithered. Better off buying a 2K monitor in 8-bit to actually get what one is paying for... at least in my application, which is motion graphics / color correction of movies, etc.

I was hoping to be straightened out: that that ASUS is the real deal and that by using DisplayPort from my card I am good. I know the price of a card is not an indicator of 10-bit; my question is really about bits, not the K in 4K, it just seems 4K is where 10-bit tends to show up. I do not care about 4K as much as color gamut; I wanted, or hoped, to find a monitor in the $400 range that has 10-bit via DisplayPort.


Love this place, thanks
NC
 
That budget is way too low for a true 10-bit monitor. The cheapest one I saw was $800. That issue is with the monitor specs, though; get an actual 10-bit monitor and you get actual 10-bit. I don't know what you mean by 4K seeming to be residual. The connections are just limited by their bandwidth, so they can't give 4K output unless you lower the Hz. It is just too much info, and the old connections do not have enough bandwidth to transfer that much data at that resolution.
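To put a rough number on "lower the Hz" (assumed nominal payload rates and a ~15% blanking allowance, RGB/4:4:4 output, so these are approximations):

```python
# Approximate refresh rate each link can sustain at 4K 10-bit RGB.
def max_refresh_hz(link_gbps, width=3840, height=2160, bpc=10, blanking=1.15):
    bits_per_frame = width * height * 3 * bpc * blanking
    return link_gbps * 1e9 / bits_per_frame

for name, gbps in [("HDMI 1.4", 8.16), ("DP 1.1a", 8.64),
                   ("HDMI 2.0", 14.40), ("DP 1.2", 17.28)]:
    print(f"{name}: ~{max_refresh_hz(gbps):.0f} Hz at 4K 10-bit")
```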

So this is just an issue with the marketing of the monitor, not the GPU. But it's common for monitors to fake a lot of their specs; there is no regulation. Response times, for example, are usually the best case, not what you'd normally get. Contrast ratio is just nonsense when it's dynamic, and even the standard figure might still say 1000:1 when it's not. There's plenty of marketing "tech" for better color and such that doesn't actually do anything. And the bit depth is usually achieved with FRC.
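FRC here is frame rate control, i.e. temporal dithering: the panel is really 8-bit and flickers between two adjacent 8-bit levels so the average over a few frames approximates the 10-bit value. A toy illustration of the idea (not any vendor's actual algorithm):

```python
# Toy FRC / temporal dithering: fake a 10-bit level on an 8-bit panel by
# alternating between two neighbouring 8-bit levels over a 4-frame cycle.
def frc_frames(value_10bit, frames=4):
    base, frac = divmod(value_10bit, 4)   # split 10-bit value into 8-bit level + remainder
    # Show the higher level on `frac` of every 4 frames, the lower level on the rest.
    return [min(base + (1 if f < frac else 0), 255) for f in range(frames)]

level = 513                                # a 10-bit value between 8-bit levels 128 and 129
seq = frc_frames(level)
print(seq, "average in 10-bit terms:", sum(seq) / len(seq) * 4)
```

The average comes out right, but it is still dithering, which is why a "10-bit" spec on a cheap panel is not the same thing as a native 10-bit panel for color grading work.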