Quadro 5000 worth it now?

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010
I found a Quadro 5000 for $130. It won't be for gaming. Is it worth the same price as a Quadro K620? Or would the K620 be better?

Looking at benchmarks, the 5000 still seems to be better despite its age.
 

Is that comparison used price vs used price? It's not exactly fair if it's used vs new. What are you using it for? Depending on the workload you may not even see a difference, and you should also check compatibility with the software you use. Note that the 5000 uses about triple the power of the K620.
 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


The prices are used vs used.

The use case is Blender animation.

Power isn't a problem, I have a 750W power supply.
 

Immitem

Reputable
Jun 20, 2015
115
0
4,690
Blender sees very little benefit from a Quadro. You would only notice it in incredibly heavy scenes, and even then only relative to a GeForce of similar power from the same architectural family.

For example, a Quadro 5000 will outperform a GeForce GTX 470 thanks to driver optimisations, while a GTX 1050 Ti 4GB will outperform both for a slightly higher price while using much less power and carrying more VRAM. I have a Quadro K4000 that I use for funsies in Blender (among other programs): it performs better than a GTX 660, but the GTX 760 soundly beats it, as its raw horsepower offsets any driver benefits. Maya would be a different story, since there are still noticeable performance boosts from the pro drivers (despite the migration to a DX11 viewport), but only when working on incredibly heavy scenes with thousands of objects AND millions of polygons, where simple tasks start to hitch very frequently.

If you are in the states then this would be your best bang for buck: https://www.newegg.com/Product/Product.aspx?Item=N82E16814487290&cm_re=gtx_1050-_-14-487-290-_-Product

Up to 4 times more performance (while boosting) while consuming half the power plus an extra gig and a half of VRAM!

If this is simply a project for funsies (like my own) then it is not necessarily a bad card, but I remember reading that it runs stupidly hot, as do most Fermi cards, so it may not have many miles left in it due to wear and tear.
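To put that power/performance claim in rough numbers, here is a back-of-the-envelope sketch. The relative-performance figure for the 1050 Ti is just the "up to 4 times" estimate above, not a measurement; the TDPs are Nvidia's published board power figures.

```python
# Back-of-the-envelope perf-per-watt sketch. rel_perf = 4.0 is the
# "up to 4 times" claim above, not a benchmark result; tdp_w values
# are the published board power figures for each card.
cards = {
    "Quadro 5000": {"tdp_w": 152, "rel_perf": 1.0},
    "GTX 1050 Ti": {"tdp_w": 75, "rel_perf": 4.0},
}

def perf_per_watt(card: dict) -> float:
    return card["rel_perf"] / card["tdp_w"]

for name, card in cards.items():
    print(f"{name}: {perf_per_watt(card) * 100:.2f} rel-perf per 100 W")
```

Under those assumptions the 1050 Ti comes out roughly eight times more efficient per watt, since it combines the claimed 4x speedup with half the board power.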
 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


Actually, it's the Quadro 4000 that is based on the GTX 470, not the 5000. I've compared the PCBs of both: the Quadro 4000 v1 (grey and black, red-topped caps) is almost an exact replica of the GTX 470 reference blower card minus some capacitors and PCB holes. The Quadro 5000 is a cut-down GTX 480, and I'm fairly certain a BIOS flash and a PCIe ID resistor reconfiguration would convert one into the other, but you'd be better off just overclocking the Quadro to match the GeForce clocks anyway.
 

Immitem

Reputable
Jun 20, 2015
115
0
4,690


That was not the point I was trying to make. The point is that Blender does not specifically benefit from Quadros outside of natural driver optimisations (I'm not talking about certification), and that a Fermi Quadro will beat a Fermi GeForce of similar or somewhat higher raw performance. From my own experiments you will find that across multiple architectures.

There were some weird outliers in the past whereby cards from the G80 Tesla series (8800 GTX/Quadro FX 4600) performed better in some software like Blender and 3ds Max than the G92-and-up Tesla series (9800 GTX/8800 GT/Quadro FX 3700). Not unlike how Fermi has better raw compute power than Kepler, which is why my Tesla C2075 performs better in Cycles than my K4000 despite the latter appearing more powerful on paper, though for different reasons. - I forget where I am going with this.

In the end unless you are making a novelty workstation, have proprietary software that is geared for a very specific card (specifically the Quadro 5000), or need 10bit support, getting a more powerful low-end Geforce or Radeon card would be the best way for you to go.

EDIT: If you really need a professional card, then a Radeon Pro WX 2100/3100 or a Quadro P600, depending on whether you need CUDA and how much VRAM (2-4GB), would offer better overall performance and not cost that much more, though I am not sure where you are from and thus your currency conversion. If really bad local taxes make computer hardware stupidly expensive where you live, then get what you can; in that case I would recommend the Quadro 5000 over the K620 due to the former's raw compute abilities.
 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


I just want more of a "is this really worth $130 for an old Quadro". Personally I don't think anything from Fermi or Fermi Refresh is worth over $100. I got a GTX 580 for 80 bucks last summer and should have flashed it to a Quadro. If Pascal wasn't such a pain in the neck I'd flash my 1070 to a P4000.
 

Immitem

Reputable
Jun 20, 2015
115
0
4,690


In that case it is iffy. It is not worth more than a hundred bucks, but people refuse to sell it for anything cheaper, so if you really want it you might be hard-pressed to find a cheaper listing for the foreseeable future. Something I want to reiterate: despite being a cherry-picked GPU it is still an unnecessarily hot card (at least according to some reviews I have seen), and depending upon how long and how hard it was used, it may fail sooner than an even older card with a lower TDP and a better cooling solution.

Mind you, I have personally never owned one, so I cannot comment from experience; it is simply something to keep in mind.

 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


I've owned 5 Quadro FX 3800s, with 4 in the same X58 Xeon build for animation. Those ran at about 88°C under load when overclocked to 700 MHz on their cores while rendering animations in Blender. They're about $35-$40 each, which makes them fairly desirable, being the Quadro variant of the 1GB 8800 GT.

I have two Quadro 4000s in my animation workstation right now, and for my current animation it takes about 4 hours to render 10 minutes of actual content ready for post-processing. Using my 1070 for the same animation takes about 5 and a half hours, despite it being Pascal and, on paper, much, much faster. The 5000 is not twice as fast as the two 4000s together, but each 4000 was around $80.

I won't have any additional money until after I finish with this animation so the faster it gets done, the better Quadro I can get. If you need more details I can provide them.
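Those quoted times can be turned into a cost per second of finished footage; a minimal sketch of that arithmetic, using only the rough 4 h / 5.5 h / 10 min figures above:

```python
# Render cost in wall-clock seconds per second of finished animation,
# using the rough times quoted above (4 h and 5.5 h for 10 min of content).
def seconds_per_content_second(render_hours: float, content_minutes: float) -> float:
    return (render_hours * 3600) / (content_minutes * 60)

dual_q4000 = seconds_per_content_second(4.0, 10)   # two Quadro 4000s
gtx_1070   = seconds_per_content_second(5.5, 10)   # single GTX 1070

print(f"2x Quadro 4000: {dual_q4000:.0f} s of rendering per s of content")
print(f"GTX 1070:       {gtx_1070:.0f} s of rendering per s of content")
```

That works out to roughly 24 vs 33 seconds of rendering per second of content, which is what makes the result so surprising: the Pascal card is about 37% slower on the same job.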
 

Immitem

Reputable
Jun 20, 2015
115
0
4,690


So you are actually planning on using the card for rendering in Cycles? In that case I must warn you that Fermi has some weird issues with Cycles. Several of the problems that I have read about being fixed are issues I have faced myself when using a Tesla C2075 and C2050, the most notable being arbitrary limits on textures and geometry, compounded by how they relate to each other in VRAM (no matter how much is actually being used). If you are not getting any CUDA errors, though, I guess your scenes are not intense enough to trip these anomalies.

The real thing, though, is that your rendering times make no sense. The BMW GPU benchmark takes 2:50-3:00 to render on one of my two GTX 1070s in my dual Xeon E5-2680 v2, 64GB DDR3-1866 HP Z620, scaling almost one-to-one with both cards together averaging a minute thirty. My K4000, which is 10-15% slower than my Teslas, takes over 14 minutes!

What are your render settings? The 1070 should be so much faster.

Also, without touching any settings run the Blender BMW GPU Benchmark and tell me what times you get: https://www.blender.org/download/demo-files/

I am really curious as to what is going on here.
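For reference, here is the same comparison as quick arithmetic, using the mm:ss figures quoted above (the K4000 time uses the "over 14 minutes" lower bound):

```python
# Convert the quoted BMW benchmark times to seconds and compare.
def to_seconds(mmss: str) -> int:
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

one_1070  = to_seconds("2:50")   # single GTX 1070, low end of the 2:50-3:00 range
both_1070 = to_seconds("1:30")   # both 1070s together
k4000     = to_seconds("14:00")  # Quadro K4000, "over 14 minutes"

print(f"two-card scaling: {one_1070 / both_1070:.2f}x")  # near-2x, i.e. almost 1:1
print(f"1070 vs K4000:    {k4000 / one_1070:.1f}x")      # roughly 5x faster
```

So by these numbers a single 1070 is around five times faster than the K4000 on this scene, which is why the animation result a few posts up looks so wrong.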
 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


Is the GPU test supposed to use 100% CPU? It's putting my 4790K at 85 degrees and that really sucks.

One 4000 is currently occupied with something unrelated, so I'll have to run it on a single card, but you said scaling was nearly perfect so we'll just halve the time. I'll post scores in a bit.
 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


GTX 1070:

[benchmark screenshot]

Single Quadro 4000:

[benchmark screenshot]


I can't stand Windows 10. Defender blocked Blender twice before for high CPU and I had to turn Defender off with Group Policy Editor.
 

Immitem

Reputable
Jun 20, 2015
115
0
4,690
OK, looking at those numbers I am assuming it is a driver issue. Use Display Driver Uninstaller ( https://www.guru3d.com/files-details/display-driver-uninstaller-download.html ) in safe mode and uninstall all drivers. Make sure that the 1070 is the only card in the system, then download and install the latest stable GeForce driver from Nvidia. I am not sure if you are running a Quadro and a GeForce card in the same system at the same time, but that is not supported and, when it does work, will negatively impact performance by a massive margin. In my own experience it is safe to run Quadro and Radeon, or GeForce and Radeon Pro/FirePro, cards in the same system, as they use different drivers.

http://www.nvidia.com/download/driverResults.aspx/133648/en-us

Lastly, your CPU getting hot, are you using the latest release candidate for version 2.8? It can use the CPU and the GPU at the same time, but unless you manage the tile sizes it will peg the CPU at 100% without any performance boost. You should only see that kind of CPU usage at the very start of the render and nowhere else.
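The tile-size point matters because Cycles hands out one tile at a time per device: a quick sketch of how tile size changes the number of work units. The 256/32 values are the usual 2.7x-era rules of thumb (big tiles for GPUs, small tiles for CPUs), not hard requirements.

```python
import math

# How tile size changes the number of work units Cycles can hand out
# for a 1920x1080 frame. Rule of thumb in the 2.7x era: large tiles
# (~256x256) suit GPUs, small tiles (~32x32) suit many-threaded CPUs.
def tile_count(width: int, height: int, tile: int) -> int:
    # Partial tiles at the right/bottom edges still count as tiles.
    return math.ceil(width / tile) * math.ceil(height / tile)

w, h = 1920, 1080
print("256 px tiles:", tile_count(w, h, 256))  # 40 tiles: fine for one or two GPUs
print("32 px tiles: ", tile_count(w, h, 32))   # 2040 tiles: keeps CPU threads busy
```

With an oversized tile on a CPU (or an undersized one on a GPU) devices sit idle or thrash, which is one way you end up with 100% CPU usage and no speedup.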
 

EquineHero

Reputable
BANNED
Oct 6, 2015
712
0
5,010


Lol I know how to update drivers... if someone is asking about buying a Quadro you should automatically assume they are an advanced user like me... I've had DDU under D:\Program Files\ for about three years now.

The latest Nvidia drivers are unstable as sh*t, lower my performance, and are honestly quite terrible. They also break a lot of SLI configs. I have zero motivation to update past 381.65. It is not a driver issue, as every other program runs fine; I even have AMD drivers installed for when I need to test incoming product returns for my side business.

"make sure the 1070 is the only card in the system" hahahahaha I've been building PCs for about 12 years now, I'm 110% positive my i7's iGPU is disabled cause I checked last night.

"Lastly, your CPU getting hot, are you using the latest release candidate for version 2.8?" I'm using Blender 2.79b, the lastest stable release. CPU drops to 38% usage towards the end of the test.

Considering my animation station is using an i3 4130 and it still beat out my 1070+4790K 4.8GHz, I'm guessing it's the Quadro Certified Drivers that are really helping. Both 4000s really slay when one isn't folding or doing something else, I'm thinking of using DifferentSLIAuto or HyperSLI so I can get SLI working properly.
 

Immitem

Reputable
Jun 20, 2015
115
0
4,690


I am earnestly trying to flipping help you. No need to be condescending and besides, not only am I not a mind reader but not everyone knows everything about anything and so gaps in everyone's knowledge are to be expected when it comes to computers as well. Me included.



What do you need SLI for?

At this point I have no idea as to what is going on, and unless I could run some tests on your PC in person I cannot fathom what is impairing your performance so badly that your 1070 is losing to the Quadro 4000s. The only other thing I can think of, with the CPU running so hot and at 100%, is that something got buggered in the settings, so it says your CUDA devices are enabled for rendering when in fact it is using your CPU with an unoptimised bucket size. I would make a backup of your user settings, reset Blender to default, then start it up again, enable the 1070 as the CUDA device, and see what happens from there.

I will Google around a bit but am otherwise stumped. Even with a Quadro 5000 and two 4000s rendering, a 1070 should be running circles around them; it does for me.
 
Solution