Can I run an Nvidia GeForce card and a weaker Quadro card in the same PC system?

CaseyX

I need to display 10-bit color for color grading and color applications, but I can't spend thousands of dollars on a Quadro card.

I want to use the 980 Ti as a computing card and the Quadro for the GUI, since it can do 10-bit (is that even a good idea?).

I'll be doing 2K work mostly, with the occasional 4K video clip.

Some gaming shouldn't hurt either, so can I set up my system like that, and will it work?

I"m going to be using Davinci resolve mailnly, various Adobe and AVID applications, Blender and Davinci resolve amongst others.

To add, what screen is recommended? I'd like an ASUS PB287Q or LG 31MU97 (how accurate are those screens?) along with an HP DreamColor Z24x. The HP is my first choice.
 
Solution
If what you're doing is displayed on the monitor connected to that graphics card, then yes. If you're trying to use both cards in conjunction with each other for one purpose, aside from using one card for gaming and another as a PhysX card, then no. Use of two independent cards is best done with separate monitors connected to each card.
 
I'm new to this, so am I correct in understanding this:

I can use the GeForce card to do all the processing (the system makes use of the GeForce's CUDA cores, VRAM, etc.), while the Quadro takes the GUI load off the GeForce and gives me 10-bit color? And this will work without fail for applications like DaVinci Resolve and Adobe, which demand CUDA cores and video memory? And having a Quadro K420 won't affect performance negatively for either work or gaming? Is my understanding correct?

Could you please substantiate that? This will be an expensive system and I can't afford for anything to not work according to my needs.
 
No, that's incorrect so far as I know. The cards CANNOT be used TOGETHER, as in both processing the same load, to achieve ANYTHING, unless they are in SLI or one is being used strictly for PhysX in games. Otherwise, you CAN use two cards in one machine, but for separate tasks. You can't use the GeForce card to process video and the Quadro for 10-bit processing; it doesn't work that way, to the best of my knowledge. I'm going to check into this further, but I'm fairly certain that what I've stated is accurate. If I'm wrong, I'll be glad to "eat my hat", so to speak.
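
Just to illustrate the "separate tasks" point: a CUDA application sees every CUDA-capable card in the machine and simply picks one by index, regardless of which card the monitor is plugged into. Here's a minimal sketch using the CUDA runtime API (the file name and build line are just assumptions for illustration, not anything specific to Resolve or Adobe):

```cpp
// list_devices.cpp (hypothetical name) -- build with: nvcc list_devices.cpp -o list_devices
// Lists the CUDA-capable cards the system exposes, so you can confirm that a
// compute app would still see the GeForce even while the Quadro drives the display.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop{};
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, %zu MB VRAM, compute capability %d.%d\n",
                    i, prop.name, prop.totalGlobalMem >> 20, prop.major, prop.minor);
    }
    return 0;
}
```

Both the 980 Ti and a K420 would show up in that list; Resolve, for example, lets you choose in its preferences which GPU(s) do the processing, so the weaker Quadro can be left to handle only the desktop.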
 
Thank you so much! I've been asking around, and I've been hearing and reading different things. My PC is dead, so I can't do research as quickly or as thoroughly as I'd like.
 


That's awesome darkbreeze! Thanks!!
 
The short answer is no, you can't. Each requires a different software driver package from Nvidia, and the two packages are mutually exclusive.

Long answer: there is nothing technically stopping those two cards from interacting. If you're savvy with the internals of Windows, you can load one driver package and then manually install / integrate the other card's drivers into Windows. The side effect of this is that you will only have one Nvidia control panel and one set of Nvidia profiles, and there could also be some very erratic / random behavior, as whichever driver suite is loaded might attempt to load profiles whenever the second card is used.

My recommendation is to just pick one type of card and go with that.

:Edit:

OK, I did some more searching, and there might be a hack-like workaround. GeForce and Quadro cards both tend to use the same hardware chip, with various feature flags enabled / disabled either in the chip or via the drivers. This means that if you can fool the system with hardware ID strings, you can sometimes load one card's drivers onto the other. This is by no means a supported configuration, as some advanced features might not work right, but I've done this before with older Quadro cards. Install RivaTuner; one of its options is to change the hardware ID that a card reports. Tell the GeForce to report as a Quadro of the same chip line and the drivers should install with the appropriate features. I haven't tried this with a Maxwell-line card, but it worked with previous generations. The Quadro equivalent of a Maxwell (GM200) GeForce is the Quadro M6000.
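
For anyone wondering what the "hardware ID string" is: it's the PCI vendor/device ID that Windows reads from the card and matches against the driver package, and that string is what RivaTuner overrides. If you want to see what your system currently reports, a small Windows-only sketch like this will print it (EnumDisplayDevices is a standard Win32 call; everything else here is just for illustration):

```cpp
// Print each display adapter's name and the PnP/PCI hardware ID string that
// Windows uses for driver matching (something like PCI\VEN_10DE&DEV_...).
#include <cstdio>
#include <windows.h>

int main() {
    DISPLAY_DEVICEA dd;
    dd.cb = sizeof(dd);
    for (DWORD i = 0; EnumDisplayDevicesA(nullptr, i, &dd, 0); ++i) {
        std::printf("%s\n    %s\n", dd.DeviceString, dd.DeviceID);
        dd.cb = sizeof(dd);  // must be reset before each call
    }
    return 0;
}
```

Compare the DEV_ portion against the device IDs listed in the Quadro driver's INF files for the matching chip; that's roughly the value the spoof has to line up with.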
 
Super thanks, palladin9479. Am I correct in understanding that by doing that I should get 10-bit color in OpenGL/OpenCL with a GeForce?

I'm not exactly 'savvy' but I can learn :) Where can I get more info on this? I can't afford an eff-up.

Btw, Quadro is too damn expensive so that is out of the question! I'm still a student.
 
How can I enable 10-bit per color support (30-bit color) in my Geforce graphics card in programs such as Adobe Premiere Pro / Adobe Photoshop?



NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI.



Not sure if that's helpful, but it does seem to indicate 10-bit support on GeForce cards.
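
To make the quoted FAQ a bit more concrete: "OpenGL 10-bit per color buffers" means the application asks the driver for a 30-bit pixel format when it creates its window, and it's that request which GeForce drivers of that era decline outside of full-screen DirectX. A minimal sketch of such a request using GLFW (GLFW is my choice here, not something the FAQ mentions), including a check of what the driver actually granted:

```cpp
// Ask for a 10-bit-per-channel (30-bit) OpenGL framebuffer and report what the
// driver actually granted; typically 8 bits on GeForce in windowed OpenGL,
// potentially 10 on Quadro over DisplayPort.
#include <cstdio>
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;

    glfwWindowHint(GLFW_RED_BITS,   10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS,  10);
    glfwWindowHint(GLFW_ALPHA_BITS,  2);   // 10 + 10 + 10 + 2 = 32 bits total

    GLFWwindow* win = glfwCreateWindow(640, 480, "30-bit test", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    GLint red = 0;
    glGetIntegerv(GL_RED_BITS, &red);      // legacy/compatibility-profile query
    std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("Red bits granted: %d\n", red);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

Even when the driver grants the 10 bits, the signal still has to survive the DisplayPort link and the monitor's panel, which is why the FAQ calls out DisplayPort and specific monitors.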
 
I found something! It doesn't say whether 10-bit color is supported or not, but it does help to some extent. Check out pages 27-28 and 32-33. What do you think?

Then I could start moving on to my other apps. Otherwise, the question that remains is whether this GeForce and Quadro configuration will leave gaming unaffected and, more importantly, only help with my Adobe and other apps!

http://www.google.co.za/url?q=http://documents.blackmagicdesign.com/DaVinciResolve/DaVinci_Resolve_Windows_Config_Guide_Oct_2013.pdf&sa=U&ved=0CA8QFjACahUKEwiR173T4JTGAhWGadsKHfqXAM4&usg=AFQjCNG3oyY-2E39QczK9Sx-Naz8PSl4qA
 
Super thanks, palladin9479. Am I correct in understanding that by doing that I should get 10-bit color in OpenGL/OpenCL with a GeForce?

Nvidia makes one line of chips and then, depending on the application, either disables features on the chip or only enables them in certain drivers. The GM200 (Maxwell) chip is used in the GeForce GTX Titan X, the GTX 980 Ti, and the Quadro M6000, so those cards are hardware-compatible with one another; it's often the exact same chip put in different products. You can load the driver for a Quadro onto a GeForce card, and GeForce drivers onto a Quadro card, as long as you can make Windows think it has a different hardware ID string, which is what RivaTuner does. Both GeForce and Quadro cards support 10-bit color output over DisplayPort in hardware, but the drivers are such that only Quadro has support for 10-bit color in OpenGL, which is what you'd want to use it in.