Quadro vs GTX - Workstation 3ds Max (and other) Graphics Card Comparison Results: 980, 1080, Titan Z, K5000, M6000 etc.

creationsof12
UPDATE: Sorry everyone, I quit the job I was working at and apparently I had this sheet on my Google account there. They deleted my account and this sheet went with it -_- Sad day... and even though I thought I had backed it up to my other Gmail account and made a copy of it, apparently it deleted that too. Way to go, Google.

If anyone else out there managed to get a copy of it, please share it!



https://docs.google.com/spreadsheets/d/13afhWgHv6xqthlC1YIZhg7mDV6wdPunUaa3lqH8gMX4/edit?usp=sharing

Click the link above to view the charts, data and test results.

Update May 20th 2016: Specs from NVidia's new GTX 1080 and 1070 added. Real-world test results to come as soon as cards become available.

Update Jan. 23rd, 2016: Charts added showing updated bang for your buck, power for your buck, best cards for 3D software viewport performance (Max, Maya, etc.) and GPU render times. You can find all data at the bottom of the Google sheet.

Jan 23rd, 2016 scores:
Bang for your buck (including VRAM): GTX 970
Power for your buck: GTX 970
Best for Display performance: GTX Titan X SC and Quadro M6000
Fastest GPU rendering: GTX Titan Z

Current conclusion: It depends on which software you are using. In 3ds Max, Maya, C4D, Blender and Adobe products, GTX cards are just as fast as (and sometimes faster than) their sister Quadro cards, depending on your viewport mode (shaded, wireframe, etc.). Quadro cards may still have an advantage in some 2D and 3D software (none that I have seen so far, but there's still a possibility). The latest GTX cards seem to work well in all viewport modes.

Note: Quadro cards tend to be faster with OpenGL and GTX cards tend to be faster with DirectX. Most 3D software will let you choose between the two to get better viewport performance depending on the type of card you have installed. HOWEVER: It's worth noting that some software will lose functionality when switching to DirectX or OpenGL. Maya, for instance, loses functionality in hair, fur and a few other things when switching to DirectX. Later GTX cards, however, seem to handle OpenGL quite well.

Note: If you are looking for the very best viewport performance and aren't on a tight budget, consider buying multiple high-end Quadro cards and running them in SLI. BUT BE AWARE: this does not work in all machines! And it only works with certain Quadro cards. You can see a list of compatible cards and machines here: https://www.nvidia.com/object/quadro_sli_compatible_systems.html

I will be updating this chart as soon as new cards are released.
 

creationsof12
I'll try to keep this up to date as graphics cards get released. I'm also trying to keep the prices accurate so if you see a price that has changed, let me know.
 

creationsof12
Full list of cards so far (spec):
Quadro FX 3800
Quadro K2000
Quadro K2200
Quadro K4000
Quadro K4200
Quadro K5000
Quadro K5200
Quadro K6000
Quadro M4000
Quadro M5000
Quadro M6000
Tesla K20c
Tesla K80 (2-in-1)
GTX 680
GTX 750
GTX 770
GTX 780
GTX 780 TI
GTX 950
GTX 970
GTX 980 SC
GTX 980 Ti
GTX 1070
GTX 1080
GTX 1080 Ti (rumored specs)
Titan Black
Titan Black SC
Titan Z (2-in-1)
Titan X SC
 

Zifm0nster
creationsof12,

Could I throw a monkey wrench into your interesting work?
What about a 30-bit color depth workflow?

I have a 10-bit-capable monitor I use for photography editing. I am researching video cards that can provide this option.
Plus, the new release of Lightroom 6 from Adobe can now take advantage of GPUs in the Develop module.
The cards mentioned above will deliver solid performance for Lightroom 6, but I would like to knock out 10-bit color depth as well.

What say you?

-Mark


 

creationsof12
Zifm0nster,

Say I.... I think so, but I'm not positive. I've done the research and, from what I've found, all of these GTX cards - GTX 780 or higher at least - SHOULD support 10-bit monitors through HDMI or DisplayPort (DP), but.... I've asked Nvidia this multiple times and they won't respond. Other people claim to have asked the same question and, again, no response. I've gotten responses to other questions, but the conversation stops as soon as I throw this question in. It's like they don't want people to know or something. I'm not really sure.

But there's one sure way to find out: test it yourself :D Since you have a 10-bit monitor, purchase a GTX 780 or higher from newegg.com. If it doesn't work how you want it to, you have 30 days to return it for a full refund. If your Newegg order doesn't include free shipping, then maybe you can find some place that does. Most places have at least a 14-day refund policy. There are several ways to test this, I'm sure, but one sure way is to see if there is a noticeable difference when you switch from HDMI or DP (DisplayPort - suggested) to DVI, as DVI doesn't support 10-bit (that I know of, anyway).
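Not from the original post, but another rough check people use alongside the cable swap: display a very smooth gradient and look for banding. The little sketch below (Python, assuming numpy and imageio are installed) just writes a 16-bit grayscale ramp to a PNG; you'd then have to view it in an application with 30-bit output enabled, over DP or HDMI, for the test to mean anything. Treat it as an aid, not proof either way, since the viewing app and OS also have to cooperate.

```python
# Rough sketch (not from the original post): generate a smooth 16-bit gradient
# to eyeball banding. On a true 10-bit pipeline the ramp should look noticeably
# smoother than on an 8-bit one, assuming the viewing app has 30-bit output enabled.
import numpy as np
import imageio.v2 as imageio

WIDTH, HEIGHT = 3840, 512  # wide ramp so each gray step spans several pixels

# Horizontal ramp across the full 16-bit range; the file is 16-bit either way,
# the display pipeline decides whether you actually see 8 or 10 bits of it.
ramp = np.linspace(0, 65535, WIDTH, dtype=np.uint16)
image = np.tile(ramp, (HEIGHT, 1))

imageio.imwrite("gradient_16bit.png", image)
print("Wrote gradient_16bit.png - inspect it for visible banding steps.")
```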

I would test it myself but I don't have access to a 10-bit monitor anymore. Had one for my old machine but that had a Quadro card I think. And I would love to know the answer to this. Could put it into my scoring sheet as an added benefit of the GTX cards :)

There is a possibility that Nvidia dumbs down their consumer cards so that they still have sales in their Quadro line, but the power and capability are there. If anything, it would be software-locked.

It's worth noting, too, that many monitors that claim to be 10-bit are not TRULY 10-bit. They use interpolation to "convert" 8-bit to 10-bit, so you could still use an 8-bit HDMI or DP connection for those. The HP LP2480zx is one that IS a true 10-bit monitor, I think.

 

Zifm0nster
creationsof12,
Appreciate the response.

My digging is along the lines of what you mention.
I can find a PDF on 10-bit (from Nvidia). It says every card after a certain series can support it.
The PDF is dated 2012, I believe.

But any updated info in the 2014-2015 time frame... nothing. Nada.

It would make sense for them not to publicize this little bit of information.
I have a friend who runs video editing for weddings. He moved over to a GeForce Titan card instead of a Quadro card.
But he does not have a 10-bit monitor. All 3 are sRGB.

I am down to an AMD FirePro W7000 or a GeForce 970-980.

I can provide an update when I find out.

Monitor: Dell U2713h

-Mark
 

Geek2015
Great job! It would be nice to add SPECwpc 1.2 and SPECviewperf 12 benchmark results to the list as well. As far as I know, the Quadro K4000 should outperform all GTX cards in the wireframe test.
 

creationsof12
Nope. A lot of the GTX cards outshine the K4000. Of course, a lot of those GTX cards are much newer than the K4000. But yes, in general, Quadro cards get higher fps in wireframe... until you get to the GTX Titan X SC, which had very high fps in wireframe.... actually higher fps than it had in shaded. That's a first for a GTX card.
 

Andreas_R
Can you provide a link to the test scene? I have a maxed-out 2013 Mac Pro w/ D700 that I would like to test and compare. Thanks
 

creationsof12
@Andreas_R

I can't give you all of the files because some of them belong to the place I work, but I can give you a couple: one for general viewport fps and the other for GPU rendering using VRay 3.20.03. I just can't give you the ones with textures, company models, etc.

All scenes were tested in Max 2014. If you don't have VRay 3.20.03 and 3ds Max 2014, you won't be able to directly compare to my results. I found that frame rates in Max 2015 and 2016 were about the same as in 2014 for the scenes we were testing (with all viewport drivers), so we decided to do all tests in 2014... and actually, on average, they were slightly lower, especially in 2015.

NOTE: Because I am testing on company machines, we don't have the latest service pack installed for Max 2014. Our "product version" is 16.0 SP3. SP4 DOES bring viewport enhancements and if you have SP4 installed, you will most likely see slightly higher fps.

You can find both files here:

https://drive.google.com/folderview?id=0ByPUCbLM66XyRDhTeUNYTlBEOW8&usp=sharing

Tip for viewport fps testing: Test in shaded mode, with hardware shading, ambient occlusion and lighting disabled (except for the default lights). Set the viewport driver to the latest Nitrous version. In the viewport configuration settings, set the default lights to 1 instead of 2 and disable highlights. (If you'd rather measure fps from a script than eyeball it, see the rough sketch below.)
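Not part of the original post, but here's a rough way to time the viewport from a script. It assumes a Max build new enough to ship the pymxs Python wrapper (older versions would need the equivalent MaxScript); the frame count and rotation step are arbitrary choices, not anything from my test files.

```python
# Rough sketch (my own, not from the original post): spin the active viewport
# and time the redraws to get an average fps figure for the current scene.
# Assumes 3ds Max with the pymxs wrapper; run from the Max scripting console
# with your test scene loaded and the viewport configured as described above.
from pymxs import runtime as rt

def measure_viewport_fps(frames=500, step_deg=0.5):
    """Rotate the active viewport a little each iteration and time the redraws."""
    spin = rt.quat(step_deg, rt.z_axis)   # small rotation around the world Z axis
    start = rt.timeStamp()                # milliseconds since Max started
    for _ in range(frames):
        rt.viewport.rotate(spin)          # nudge the active viewport
        rt.redrawViews()                  # force a redraw so the GPU does the work
    elapsed_ms = rt.timeStamp() - start
    return frames / (elapsed_ms / 1000.0)

print("Approx. viewport fps: %.1f" % measure_viewport_fps())
```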

Tip for GPU rendering: Leave the settings as they are in the file. Use VRay RT. The first render you send will be a little bit slow, so one way to kind of "re-send" it is to adjust one of the settings in VRay RT (like the ray bundle size) and put it back. You can also play with different ray bundle sizes to see which one gives you the fastest render times. It should be set up to stop the rendering as soon as it reaches its max noise of 0.02. Make sure that "Show statistics" is checked so you can see the render time. (There's also a minimal render-timing sketch below if you'd rather log times from a script.)
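Also not from the original post: a minimal way to log the time of a single render from a script instead of reading it off the frame buffer. It assumes the scene file already has the renderer and noise threshold configured as described above, and that pymxs is available; all it does is call whatever renderer the file is set to use and time it.

```python
# Rough sketch (not from the original post): time one render of the current
# scene with whatever renderer the file is set to use (VRay RT for the GPU test).
from pymxs import runtime as rt

def timed_render():
    start = rt.timeStamp()        # milliseconds
    rt.render(vfb=False)          # render with the scene's current renderer/settings
    return (rt.timeStamp() - start) / 1000.0

print("Render time: %.1f s" % timed_render())
```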

For both tests, use something like EVGA Precision X 16 (or similar software) to control your card's fan and keep it cool. I've been running GPU renders for years on an EVGA GTX 780 (got it when it first came out) without any problems. I just put my own fan curve on it to keep it pretty cool. Additionally, if the fan ever goes out (it never has for any of my cards, but still), it's really easy and inexpensive to replace. And thanks to software like Precision X, it will tell you if there is ever a problem like that. I prefer high fan speeds over under-clocking the GPUs (for longer life).

Anyway, let me know what kind of fps and GPU render times you're getting and send some screen grabs. Also, if it's not too much trouble, send me your machine and card specs as well. Would be really useful :)
 

japancakes
Thanks a ton for your work and research, this is epic. I'm feeling really content that I got the GTX 980 Ti Amp! Extreme with the highest clock speed and excellent cooling. Going SLI was a question of mine, and it only seems valuable in rendering, so I'll only get a second card if my gaming habit gets out of control.

Thanks again.
 

creationsof12
Yeah. It seems to scale about the same. Though I should say that I've done much more extensive testing in Max, since that's the main software I work in.
 

creationsof12
Glad to hear that this might be helping some people out there! :)

I feel like there may be a little confusion here, though - SLI is when you bridge multiple graphics cards together using an SLI bridge and enable it in the Nvidia control panel. SLI only increases performance in video games, not pro software like 3ds Max, Maya, etc. The viewport in Max will only use one graphics card whether SLI is enabled or disabled in the Nvidia settings (I know you understand that part). HOWEVER: You actually need to disable SLI to take advantage of all installed GPUs for things like GPU rendering. Things like VRay RT (GPU rendering) will automatically take advantage of all installed graphics cards even if there is no SLI bridge installed :) (If you want to double-check which GPUs your system actually exposes, see the quick sketch at the end of this post.)

Felt like I should explain that just in case.
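Not from the original post, but a quick way to sanity-check which GPUs your machine exposes (and therefore what a GPU renderer can enumerate, SLI or not) is to ask the stock nvidia-smi tool that ships with the Nvidia driver. A minimal Python wrapper, as a sketch:

```python
# Rough sketch: list every Nvidia GPU the driver exposes, using the stock
# nvidia-smi tool. GPU renderers enumerate devices on their own; this just
# confirms what is physically visible to the system, SLI enabled or not.
import subprocess

def list_nvidia_gpus():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,name,memory.total",
         "--format=csv,noheader"],
        text=True,
    )
    return [line.strip() for line in out.splitlines() if line.strip()]

for gpu in list_nvidia_gpus():
    print(gpu)
```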
 

NotADesertRat
So -- this has been a very helpful thread, and an amazing bunch of research. But I'm left confused based on conversations I had with Nvidia reps. I'm developing a flight sim lab, and the developer of our IG software (subrscene.org) recommended dual GTX 980 cards on a Core i7 Extreme to get three 2560x1440 video streams out at 60 Hz. Then, just last weekend, a Quadro guy from Nvidia said, "sure, that'll work," but that dual Quadros work just as well if they're configured correctly. When run on Quadro-SLI-certified machines (a very, very small list), dual Quadro cards can present as a single GPU and manage load sharing between them, something the SLI cards won't do, he said. Worse, I'm in the unfortunate position (thanks to bureaucracy) of being unable to buy a couple of test systems...
 

creationsof12
Glad you brought that up, NotADesertRat. According to Nvidia, yes, compatible Quadro cards on compatible machines SHOULD increase your fps, but... I've tried this multiple times with our compatible cards and compatible machines and it has never worked. There was no increase in fps in... well... anything. BUT we have not tried SLI with these cards: K6000, M4000, M5000 or M6000. So it may work with those. Unfortunately, I was only able to get one of these cards at a time for testing, so I never got the chance to try SLI.

Would love to be able to test this, though! I'll try my best to get ahold of two of these cards, but... chances are slim that I'll actually get them. Graphics card companies normally only lend out one (demo) card at a time :/

If you end up going for Quadro cards in SLI, let me know what you find out! :)
 

Vijay Angester
Very useful thread. But for a final decision: will a Titan X be good for a pro workstation with an i7-5960X and 64 GB of RAM? Or do I really need a Quadro? I'm confused.

The software I am going to use is Maya, Nuke, Cinema 4D, AE, PH and Blender.
This is going to be my new workstation for the next 3-5 years.

Help me..
 

creationsof12
Yes, the Titan X is very good for the software you'll be working in (and the hardware you'll be working with), especially Maya, C4D, Blender and other 3D software that takes advantage of the GPU's processing power. You do not need a Quadro card to work in any of those. In fact, the Titan X will outperform most of the expensive Quadro cards in these programs.

Thank you for your interest in this thread! Glad it's helping people out there!
 

SoNic67
This is not only useless but also deceiving. Did you even run any real benchmarks for this? SPECviewperf is one of the well-established tests for workstation software, and it will tell a different story. GTX cards get crushed by the "weaker" Quadros, with the exception of the Titans, maybe.
My GTX 960 is similar to a Quadro K2000 in that test.
 

creationsof12
Thanks! So glad you find it useless :D lol.

Yes, I ran "real benchmarks" for these cards. But what I found is that... those benchmark scores are "useless." The scores don't seem to correspond to the performance I get in the listed programs. These are real-world tests, in actual 3D software, not just a graphics card scoring program. I just do my tests and copy and paste the data into the Google sheet. I even install the cards into workstations and custom builds and do actual work on them for weeks at a time.

If you don't believe my test results, feel free to do them yourself. In fact, I would really appreciate it if someone did that :)
 

SoNic67
20341.page

20379.page

As you can see, it's not as "clear" as you posted. A puny K2000 (384 Kepler cores @ 954 MHz, 732 GFLOPS SP, 30 GFLOPS DP) keeps up with the GTX 960 (1024 Maxwell cores @ 1127 MHz, 2300 GFLOPS SP, 72 GFLOPS DP). And it actually crushes it in Creo, SNX and SW.
The newer K2200 (640 Maxwell cores @ 1000 MHz, 1280 GFLOPS SP, 40 GFLOPS DP) is clearly above it.
Moral of the story: theoretical performance and real-life performance are two different things.

BTW, the M6000 gets 172 in SNX, 120-130 in Maya, and 150-170 in Catia.
Is any GTX even close to that, including the Titan?
 

creationsof12
Heh. Yes, I completely agree with your statement: theoretical performance (SPECviewperf scores) and real-life performance (actual viewport or GPU rendering performance in Maya, 3ds Max, C4D, etc.) are two completely different things.

What I have posted in my charts is still not false, misleading or deceiving. It's just the data I've collected by doing the tests in the software I work in. Really, though, don't take my word for it. Test it for yourself. I would be interested to see if others come up with different results. I would be very grateful if you tested the M6000 and GTX Titan X back to back in the same scenes and posted your findings.
 

SoNic67
What you posted as a "conclusion" is false and misleading. The results are not even in the same ballpark as you try to suggest here:
Current conclusion: Go with (MUCH cheaper) GTX cards over Quadro cards for workstations using professional software such as 3ds max, Maya, C4D, After Effects, Photoshop, etc. Quadros do have a slight advantage over their sister (almost identical) gtx cards due to the drivers but the newer GTX cards can hit the same marks for a much lower price.

My conclusion is: the advantage of using a Quadro is important and is strongly dependent on the software you plan to use. Research before you buy, and never rely on a gamer to recommend your workstation card. If you just paid $10K for the software license, and are also paying rent and utilities for the work space, cheaping out and buying a GTX that will cause only headaches (extra heat, inadequate power supplies, and actually lower performance) is just bad business.

I work daily in this context, and I know that you can recover the costs through increased productivity in just a few projects (maybe one, if it's a big one). I did run other tests too (like Cadalyst, which requires AutoCAD installed on the PC to run), and I was tempted to try using cheaper GTX cards at work because of articles like this one. I proudly brought in my super-duper GTX cards, replaced the power supply to handle the extra power requirements, installed the GeForce card and drivers, only to find that I had just crippled my workstation's speed. Even with modded Quadro drivers installed on the GeForce (INF files), the results didn't change. I even modded the actual cards (see below). Waste of time.
Sure, for a kid who wants to feel good that his gaming rig is "as good as a pro workstation", this article is great; it makes them feel warm and fuzzy about the money they spent.

Also, the drivers are not the only "culprit" here, even if people assume that all the time. I modded the firmware straps on Fermi-generation GeForces to make them appear as Quadros (GTS 450 into Quadro 2000, GTX 480 into Quadro 6000) and, even though the BIOS, drivers and every other piece of software recognized them as Quadros with higher frequency, and even though I had access to all the extra features derived from those drivers (like GPU pass-through for the modded Quadro 6000), the benchmarks still remained poorer than the real things. The same happened to people who hardware-modded Kepler-generation cards (for those, Nvidia made it harder; the straps are actually small SMD resistors).
I assume that there are some other modifications directly "burned" into the actual chips before packaging that unlock the OpenGL performance. Sometimes AMD GPUs might give you an edge (better OpenGL), sometimes not (Autodesk migrated to DirectX in their products). See the results for Showcase in the SPECviewperf tests, but keep in mind that DirectX sometimes has limitations, like in Maya.
 

creationsof12
To: SoNic67

I feel like you're still under the impression that I am trying to trick people for some reason. I can assure you, I am not and have no reason to do so. These are the actual results from my testing. I didn't make them up. I'm just trying to help others achieve what I was trying to achieve: get the best viewport performance for the price (in the software I work in). I am not a GTX salesman or anything. I'm not making any money from this... in fact, I'm losing a lot of money by buying and testing these cards and then selling them as used. I'm going out of my way, spending a lot of my own time and money, to help others and the company I work for. I work in the graphics industry (not gaming...?) and decided to take it upon myself to do this because I was having trouble finding good comparisons elsewhere (for pro 3D software).

For AutoCAD, the Quadro cards might be better. I wouldn't know. AutoCAD is one of the programs I didn't test, since I rarely use it, and when I do, I have never had any sort of slowdown due to insufficient GPU power. I'm not working in big enough CAD files to notice. But yeah, honestly, just go with whatever card you feel like going with. If you work in AutoCAD and a Quadro card is giving you better viewport performance, then by all means, go with a Quadro card.

To everyone else:

If you use the software I've listed and are looking to get the best viewport performance for your buck, feel free to take a look and read everything in my chart. The GTX 980 SC will give you around 3 times the performance of a Quadro K4000 in 3ds Max. Why do I compare those two in this instance? Because they're in around the same price range.

I've also spoken directly with some of the higher-up people from Autodesk (both in person [they work in the same building and sometimes come down to visit] and over the phone), and they agree with my chart in that many of the GTX cards will provide better performance than Quadro cards at the same price. I know that at least one of them actually uses my chart to send to customers who are looking for a good graphics card solution for Max.

PS. I am not a "gamer" beyond playing a video game every once in a while when a new game comes out that I want to play (seriously, maybe a couple of times a year). I am a professional graphic designer who works in software like 3ds Max, After Effects, NukeX, etc. (basically the software I've listed), doing architectural visualization.