Graphics card for my new display?

VOiDCS

I recently purchased this monitor, which covers 99% of the Adobe RGB color gamut. Which graphics cards will be able to display all the colors on this monitor? Would any Nvidia card work? It mentions a 14-bit 3D LUT and Delta E ≤ 2. Thanks!

Edit: forgot to link the monitor
 
I ran into the same questions when I got a good-quality monitor a while back.
"Most" newer Nvidia cards will display 10-bit color (over 1 billion colors). Check the Nvidia site for your specific card to make sure.

There are also sites where you can get test images to see what color output you actually have, although I found many of them require the commercial version of Photoshop (which I do not have). You can also roll your own; see the sketch at the end of this post.

And lastly ... your eyes may or may not be able to tell the difference between 8-bit, 10-bit, and 14-bit color.
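Here's the kind of do-it-yourself test image I mean, as a minimal Python sketch (assumes numpy and Pillow are installed; the filename is just a placeholder):

import numpy as np
from PIL import Image  # pip install numpy pillow

# A wide, smooth grayscale ramp: an 8-bit pipeline has only 256 steps, so
# stretching them across 2048 pixels makes banding easy to spot by eye.
WIDTH, HEIGHT = 2048, 256
ramp = np.linspace(0, 255, WIDTH).round().astype(np.uint8)
Image.fromarray(np.tile(ramp, (HEIGHT, 1)), mode="L").save("ramp_8bit.png")
# A true 10-bit ramp needs a 16-bit file format plus a viewer, driver, and
# monitor that all run at 10 bits end to end; otherwise it bands the same way.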
 

I forgot to link the monitor in the original post, but here it is. How many bits of color is it? And are you saying basically any Nvidia card can run it?
 

So something like a GTX 1060 would be able to display the colors from this monitor? I wasn't able to find anything about color bit depth on the Nvidia site.
 

Yup, 12-bit, no problem.
We need better pixels
And we do, not necessarily drastically so, but this type of investment in the future of visual computing is a welcome one. If we can get past the typical assumption of what HDR is, then you can hopefully grasp what this means for gaming. This won’t be as dramatic a difference as moving from 16-bit internally rendered color to 32-bit, but it’ll still be quite the difference when combined with increased luminance ranges and contrast ratios. Of course, it all has to be used right to make a drastic difference.
The display controller in the GTX 1080 can handle 12-bits-per-pixel color and supports advanced color modes, including BT.2020 and SMPTE 2084 color quantization. Standards for Ultra HD displays that combine BT.2020, SMPTE 2084, and 10-bit color, including PQ10 from the Ultra HD Forum, already exist, and Ultra HD TVs using the new HDR standards are already in the pipeline. PC monitors utilizing these standards will likely arrive by early 2017. Pascal also supports HEVC 10/12-bit encode and decode.
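To give a feel for what that SMPTE 2084 (PQ) quantization does, here is a small Python sketch of the encoding curve. The constants are the published ST 2084 values; the helper names are mine:

# SMPTE ST 2084 (PQ) constants from the spec
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10000 nits) to a 0..1 PQ signal."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

def quantize(signal: float, bits: int) -> int:
    """Round the 0..1 signal to an integer code value at the given bit depth."""
    return round(signal * ((1 << bits) - 1))

for nits in (0.1, 1, 100, 1000, 10000):
    s = pq_encode(nits)
    print(f"{nits:>7} nits -> PQ {s:.4f} -> 10-bit code {quantize(s, 10)}")

The curve spends most of its code values at the dark end, where the eye is most sensitive, which is how 10 bits of PQ can span 0 to 10,000 nits without obvious banding.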
 

Hm, that quote is about a 1080; does it apply to a card like a 1060 as well? And how many bits of color is the monitor I linked? Thanks!
 

Yup, but also know that the video outputs are important, so look for HDMI 2.0b, as not all cards have the newest.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125913
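Why the outputs matter, in rough numbers: HDMI 2.0/2.0b carries 18 Gbps raw, about 14.4 Gbps of usable video data after 8b/10b encoding. A back-of-the-envelope Python check (it ignores blanking intervals, so real links need even more headroom):

# Uncompressed RGB video data rate in Gbps (3 channels, no blanking)
def data_rate_gbps(width, height, hz, bits_per_channel):
    return width * height * hz * bits_per_channel * 3 / 1e9

HDMI_20_EFFECTIVE = 14.4  # Gbps usable out of 18 Gbps raw
for bpc in (8, 10, 12):
    rate = data_rate_gbps(3840, 2160, 60, bpc)
    verdict = "fits" if rate <= HDMI_20_EFFECTIVE else "does NOT fit"
    print(f"4K60 at {bpc} bpc: ~{rate:.1f} Gbps -> {verdict} in HDMI 2.0b")

That's why 4K60 at 10-bit over HDMI 2.0b typically drops to 4:2:2 chroma subsampling, and why DisplayPort is often the better plug for deep color.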
 

Alright, but you said 10-bit. Is the monitor I showed 10-bit or 14?
 


Ah, so the monitor goes to 14-bit but the card is only 10? What type of software would make up for that?
 

The video drivers have features like shadow blending, dynamic range, dynamic contrast enhancement, and color enhancement, just to name a few.
The 1060 6GB is a very good card and well worth the price.
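I can't say exactly what Nvidia's driver does internally, but the classic software trick for pushing higher-precision color through a lower-depth output is dithering. A rough Python sketch, purely for illustration (the function name is mine):

import numpy as np

def dither_down(codes, src_bits=14, dst_bits=10, rng=None):
    """Rescale src_bits codes to dst_bits, adding a little random noise
    before rounding so smooth gradients don't collapse into bands."""
    rng = rng or np.random.default_rng(0)
    scale = ((1 << dst_bits) - 1) / ((1 << src_bits) - 1)
    noisy = codes * scale + rng.uniform(-0.5, 0.5, size=np.shape(codes))
    return np.clip(np.round(noisy), 0, (1 << dst_bits) - 1).astype(int)

# A smooth 14-bit ramp: plain rounding maps runs of neighboring input values
# to the same 10-bit step (banding); dithering varies per pixel, and the eye
# averages the noise back into a smooth-looking gradient.
ramp14 = np.arange(0, 1 << 14)
plain = np.round(ramp14 * (1023 / 16383)).astype(int)
print("distinct 10-bit steps after plain rounding:", len(np.unique(plain)))
print("dithered neighborhood:", dither_down(ramp14)[8000:8008])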
 
Solution

Great. Thanks for everything!