4k 10bit a farce?
Here is the challenge: actually getting 4K 10-bit out of the monitor after buying it.
From what I understand, 10-bit 4K is really not being delivered from a GPU. In my case, an Nvidia Quadro 4000 does indeed deliver 10-bit, but at 2560x1600, and only as long as I have the right cable as well. http://www.nvidia.com/object/product-quadro-4000-us.html
My question is for those who are buying these monitors ($500+ easily) and plugging them into $500+ GPUs; please explain to me:
How will I get 10-bit 4K out of the GPUs that are out there, like this one?
http://www.newegg.com/Product/Product.aspx?Item=N82E16824236399
As I see it, I can have 10-bit 2K with a $750.00 GPU, BUT I still do not have an HDMI 2.0 out on that card. So in essence, aren't I losing my 10-bit due to subsampling on the HDMI out?
See the Y'CbCr 4:2:0 subsampling graphics in this article:
http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
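To put rough numbers on why I think the HDMI out is the bottleneck, here is a quick back-of-envelope sketch in Python. The blanking overhead and the effective link capacities are my own approximations, not spec-exact figures:

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.1):
    """Approximate required video data rate in Gbit/s, with ~10% blanking overhead assumed."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

# Approximate usable payload after 8b/10b encoding (my rough numbers):
LINKS = {
    "HDMI 1.4 (340 MHz TMDS)": 8.16,
    "HDMI 2.0 (600 MHz TMDS)": 14.4,
    "DisplayPort 1.2 (HBR2 x4)": 17.28,
}

MODES = {
    "4K60 RGB 8-bit (24 bpp)":   (3840, 2160, 60, 24),
    "4K60 RGB 10-bit (30 bpp)":  (3840, 2160, 60, 30),
    "4K60 4:2:0 8-bit (12 bpp)": (3840, 2160, 60, 12),  # the trick the AnandTech article describes
}

for mode, (w, h, hz, bpp) in MODES.items():
    need = data_rate_gbps(w, h, hz, bpp)
    fits = ", ".join(f"{name}: {'yes' if need <= cap else 'no'}" for name, cap in LINKS.items())
    print(f"{mode}: ~{need:.1f} Gbit/s -> {fits}")

If that arithmetic is roughly right, 10-bit 4K60 only fits over DisplayPort 1.2, and HDMI 1.4 only manages 4K60 at all by dropping to 4:2:0, which is exactly the subsampling the article talks about.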