Pro Graphics: Seven Cards Compared

Status
Not open for further replies.

zajooo

Distinguished
Aug 13, 2008
3
0
18,510
Hi guys,

I was always wondering how these cards would compare against gaming cards, and whether the gamer cards wouldn't give us, as we say, "a lot of music for a little money." Would we get similar results in the field of geometry transformation, without using any anti-aliasing or any of the really unnecessary features?

Thanx
 

Evendon

Distinguished
Aug 13, 2008
1
0
18,510
I would also like to know how these cards handle games. I'm especially interested in how the mobile versions of the Quadro FX work with games.
I would also like to play on my work laptop with its Quadro FX chip (FX 3600M).
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
I was mostly wondering how a GeForce 8800 GT or an ATI 4750 would stack up. Given that the lowest prices in Denmark for a V3600 or FX 570 start at 1,150 kr and an 8800 GT 256 costs only 680 kr (the 4750 is 1,070 kr), they're comparable in the budget department.

We're mostly doing basic grid stuff at work, and we're using Intel 945 chipsets with onboard graphics for that. The Pentium Dual-Core seems to handle it okay, but it isn't happy about textures at all. For those jobs we use old P4s with GeForce 6600 GT cards, not Quadro anything, and I can't help but wonder whether using gamer cards is the right or wrong choice (it has worked great for us so far).


P.S. We use Inventor, Mechanical Desktop, and Revit from Autodesk (newest and second-newest versions only), so only the 3D Studio results are interesting for me.
 

venteras

Distinguished
Aug 13, 2008
2
0
18,510
Where are the Quadro FX 3700 and 4700?

I don't see how you can do a review dated 13 August that seemingly covers all current mainstream cards and not include two of NVIDIA's primary cards.

Was this paid for by ATI?
 

sma8

Distinguished
Aug 13, 2008
3
0
18,510
[citation][nom]zajooo[/nom]Hi guys, I was always wondering how these cards would compare against gaming cards, and whether the gamer cards wouldn't give us, as we say, "a lot of music for a little money." Would we get similar results in the field of geometry transformation, without using any anti-aliasing or any of the really unnecessary features? Thanks[/citation]

Pro graphics cards are different from gaming/consumer cards. Pro cards are designed to handle workstation applications such as AutoCAD or 3D Studio MAX, whereas gaming/consumer cards are designed for desktop PC apps and games. You can see the differences between the two here:
http://www.nvidia.com/object/builtforprofessionals.html

That's a good example of how the two kinds of card work in workstation applications, and it's why pro graphics cards are so expensive.
 

anonymous1000

Distinguished
Aug 13, 2008
4
0
18,510
Hello. Thank you for the interesting article. What is also interesting is the huge gap between the SPEC results and my day-to-day experience with ATI pro cards. When the first SPEC results showed up, I started recommending that my clients buy FireGL 3600-8600 cards, but unfortunately they were VERY poor performers in 3ds Max work. Apart from the fact that you couldn't even use half of their strength with the first drivers, even now, comparing the FPS count of the V5600 with that of a 9600 GT in 3ds Max shows it is much more comfortable to work with the latter. If you want to move a large project around the viewport, it moves a lot faster with a 9600 GT installed. THIS is the benchmark I would like to see here, because IT REALLY MATTERS to animators. Thank you.
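
As a rough illustration, the viewport FPS benchmark being asked for can be as simple as timing redraws. Below is a minimal, hedged sketch in Python; `redraw` is a hypothetical stand-in for an actual 3ds Max viewport redraw, not a real API call:

```python
# Minimal sketch of a viewport FPS benchmark, assuming a hypothetical
# redraw() callback standing in for a real 3ds Max viewport redraw.
import time

def measure_fps(redraw, duration_s=5.0):
    """Call redraw() in a tight loop and return average frames per second."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        redraw()      # one full redraw of the test scene
        frames += 1
    return frames / (time.perf_counter() - start)

if __name__ == "__main__":
    # Stand-in workload; a real test would drive the DCC app's viewport
    # while panning or orbiting a heavy scene.
    print(f"{measure_fps(lambda: sum(range(100_000))):.1f} fps")
```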
 

bydesign

Distinguished
Nov 2, 2006
724
0
18,980
[citation][nom]sma8[/nom]Pro graphics cards are different from gaming/consumer cards. Pro cards are designed to handle workstation applications such as AutoCAD or 3D Studio MAX, whereas gaming/consumer cards are designed for desktop PC apps and games. You can see the differences between the two here: http://www.nvidia.com/object/builtforprofessionals.html That's a good example of how the two kinds of card work in workstation applications, and it's why pro graphics cards are so expensive.[/citation]

Not true. I can flash the BIOS on my 8800 GTX and it will run just like its workstation cousin. They use the same hardware; they're just handicapping the consumer card.
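
For what it's worth, the "same hardware" claim is easy to inspect at the PCI level. A minimal sketch, assuming a Linux box (the sysfs paths are standard; the IDs shown in comments are examples only), listing the vendor/device IDs a driver uses to tell a GeForce from its Quadro sibling:

```python
# Sketch: list PCI display controllers with the vendor/device IDs that
# the driver sees; sysfs paths are standard on Linux.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if pci_class.startswith("0x03"):                   # display controller
        vendor = (dev / "vendor").read_text().strip()  # e.g. 0x10de = NVIDIA
        device = (dev / "device").read_text().strip()  # board-specific ID
        print(f"{dev.name}: vendor={vendor} device={device}")
```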
 

theLaminator

Distinguished
Jul 21, 2008
127
0
18,680
I wish this article had been published about three days ago; it would've helped my decision on which new laptop to get for school (I'm an engineering major, so I actually will use this). I finally decided on an HP with the FireGL V5600, and it looks like I made the right choice based on these benchmarks. I guess we'll see when I actually get it and try it in real applications.
 
G

Guest

Guest
"I would also like to know how these cards handle games. Especially im interested in how mobile versions of quodro fx work with games.
I would also like to play with my work laptop with quodro fx-chip (fx 3600m)"

I also have a Quadro FX 3600M in my laptop. In 3DMark06 I get a score of 8,800. COD4 and UT3 run smoothly with all settings maxed out at 1920x1200. Quadro cards are as good at gaming as their GeForce equivalents, or better.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
First off, I have to say that this has been a very good review. Naturally, I have several things I'd like to complain about. I mean, as readers, it's our God-given right to complain and never be satisfied. Right?

The good: you presented and explained each GPU very nicely, noted every card's gaming counterpart, included specifications for each card, and commented on them in detail.

So if everything is so well and good, why do I complain?

Simple: the tests. You're using the SPECheatTest. It's well known that this test is optimized to show that even the crippled "workstation" cards outperform the hardware-superior "gaming" cards. The fact is, at least three of the programs you tested here don't reflect the artificial conditions posed by the SPECheatTest. Well, actually it's two: you didn't test AutoCAD, and I can't really comment on the other applications, as I'm not familiar with them. The two applications I am familiar with, and fully competent to speak about, are 3ds Max and Maya.

What do these two have in common, other than being under the same roof now? DirectX support. ANY application supporting DX is not being crippled in the drivers when running. You will see huge leads for the ATI cards over NVIDIA here. But I'm getting ahead of myself.

In all the tests, you failed to mention that OpenGL is horridly slow, even on these "professional" cards. This is, of course, in the cases where you can choose between OGL and DX in the same application. Not only is it slow, it's also visually incomplete: it lacks the functions for high-quality display of lighting and shadows that only DX 9.0c can provide. I'm willing to forgive you the last one, as you maybe only ran the test and didn't check for visual quality differences. That's not to say the SPECheatTest you used can display these real-life conditions anyway.

So with these two things in mind, it's easy to see that only idiots, or people unable to use DX, would use OGL instead. To be fair, you did say that NVIDIA's 8800 GTX... erm, I'm sorry, I meant Quadro FX 5600, is the best OGL card, and I agree. NVIDIA uses archaic logic in their hardware, and OGL fits that pattern perfectly, so it's no surprise that NVIDIA should win. If you have to use OpenGL for any reason, NVIDIA is your man... erm, company.

If, however, you're not stuck on the appalling Mac platform or with archaic software that doesn't support DX, it should be mentioned that ATI has a significant lead. Not surprising either, since it has 3x more shaders at the same price. Games can't use these well most of the time, but that is not the case with digital content creation (DCC) programs like 3ds Max, Maya, or AutoCAD.

Speaking of games: you did mention that the 8800 GTX (= FX 5600) is two generations old. You failed to mention that the latest GTX 280 has 240 unified processors, and you failed to add it to the tests. Not that I think the SPECheatTest would show that it's 2x faster, but as a matter of fact, it is, for any of the three above-mentioned programs, and likely all others that support DX (possibly OGL too, but I'm not sure how crippled that is in the drivers; more on this later).

Which brings me to the ultra high end of the DCC world: ATI's latest 48x0 cards. These have 800 unified shaders. They just wipe the floor with all the cards mentioned in your article put together(!). All of them together (if that were possible) don't have enough power to compete with even a single new card from ATI.

You conclude that ATI is the best deal at $1,000, but you fail to notice and differentiate between outdated OGL programs and new D3D ones. The ATI card is the absolute, undisputed winner for CURRENT DCC. The crown can, by no stretch of the imagination, go to NVIDIA, unless you state that it's exclusively for outdated, OGL-only programs, in which case it does get it. Also, I'd again like to mention that the fact that you're using the SPECheatTest isn't helping you build your case either. And in addition to all this, you also failed to mention that there is a 10x more powerful card for less than $300, the HD 4870. The difference is solely in the drivers, and not even the whole driver, just the OGL implementation.

Which finally brings me to drivers: all the professional cards are 99% the same as their gaming equivalents. They differ only in drivers. You said so, and I agree. What you failed to mention is that the professional cards are actually noticeably slower than their gaming equivalents. For stability, they say. I challenge anyone to prove that gaming cards are less stable.

There are two reasons for card instability: 1) inadequate cooling, or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) be more stable.

The problem with this nice theory is that NO DCC program available today can stress any of the cards mentioned here beyond their framebuffer capacity. Chances are that your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can utilize only a single CPU core.
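
As a toy illustration of what per-viewport parallelism could look like (and why it is absent in practice: real viewports share one scene graph), here is a hedged Python sketch that fans four independent, invented redraw jobs out to four worker processes:

```python
# Toy sketch of "one core per viewport": four independent redraw jobs
# run in parallel processes. Real DCC viewports share a single scene
# graph, which is exactly why they stay on one core in practice.
from multiprocessing import Pool

def redraw_viewport(view_id: int) -> str:
    # Invented stand-in for a per-viewport software redraw pass.
    checksum = sum(i * i for i in range(2_000_000)) % 997
    return f"viewport {view_id} redrawn (checksum {checksum})"

if __name__ == "__main__":
    with Pool(processes=4) as pool:     # one worker per viewport
        for line in pool.map(redraw_viewport, [1, 2, 3, 4]):
            print(line)
```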

Which again brings me back to drivers, as the only other cause of instability. Here is another interesting fact you might not have known: the people writing drivers for the 99%-identical cards don't do it twice. The process may vary, but in a nutshell it goes like this: they write the drivers for the gaming card, and at that point the work branches into two paths, games or DCC software. The DCC path gets further testing against the DCC applications, and in 99% of cases it's done there and published. It is thoroughly tested to verify that there aren't any major bugs, and then shipped.

The games driver path does not end there. The driver programmer has one more duty: to cripple performance under whatever he deems to be "professional" software(!!!). So to reiterate: the driver programmer, instead of perfecting the drivers to be better, actually sits down and starts writing code to CRIPPLE(!?!) the gaming line of cards. One would imagine he could spend his time employing his talents elsewhere.
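
To make the claim concrete, here is a purely illustrative toy model of the per-application throttling being described. The process names, the lookup table, and the slowdown factor are all invented for illustration; nothing here reflects any real driver's internals:

```python
# Invented toy model of "cripple by application" driver logic; nothing
# here is taken from an actual driver.
PROFILES = {
    "3dsmax.exe": "workstation",
    "maya.exe":   "workstation",
    "crysis.exe": "game",
}

def performance_multiplier(process_name: str, is_quadro: bool) -> float:
    """Return the speed factor applied once the running app is identified."""
    kind = PROFILES.get(process_name.lower(), "game")
    if kind == "workstation" and not is_quadro:
        return 0.2   # gaming card detected in a DCC app: slow code path
    return 1.0       # full speed otherwise

print(performance_multiplier("3dsmax.exe", is_quadro=False))  # 0.2
print(performance_multiplier("3dsmax.exe", is_quadro=True))   # 1.0
```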

So, crippled or not, the drivers are 99% the same. If instability plagues one line of cards, the other isn't spared by "superior" drivers. So, in reality, the workstation cards are no more stable than their gaming siblings, even if many would like you to believe that.

In conclusion, I'd also like to nitpick at the fact that you used very low resolutions for testing, capping at 1600x1200. As you might have guessed, anyone interested in working in 3D will start at that resolution, not end at it. This is not a serious oversight, since you were using the SPECheatTest to test everything, so your results should be taken with a grain of salt anyway.

The add-on "Maxtreme 11" driver from Nvidia is also interesting. This plugin was developed specifically for 3D Studio Max, and leads to a significant performance boost in this program. In contrast to the previous versions, Maxtreme 11 supports OpenGL and also the DirectX API. The hardware shader operations of 3DSM especially benefit from it.

Maxtreme was once useful, some eight or so years ago. Ever since DX entered DCC programs, OGL and Maxtreme and the likes of them have been dying. The last couple of iterations of Maxtreme were nothing more than DirectX with a fancy name, to make Quadro buyers happy and make them feel special. It held no visual or speed advantages whatsoever. Then again, it introduced no ill effects either.

Honestly, I haven't tried r11, but I seriously doubt it brings anything new (since that's not technically possible). And as for OGL support in it, that just goes to show how much they know about 3ds Max. With OGL you cannot enable viewport shadows or any of the advanced viewport lighting techniques possible only in D3D. So, as I said before, OGL is not only seriously slower (and I mean seriously), it's also lacking a lot in the visual quality department. I've said it before and I'll say it again: only idiots, or people unable to use DX, will opt for OGL.

But here we recognize that DirectX is slowly becoming acceptable in the workstation sector, which was previously reserved exclusively for OpenGL.

And when you say "slowly", you sound like it's 2001 all over again. Newsflash: it's 2008. DX has been a de facto standard for about seven years now. True, not all DCC software reflects this; AutoCAD, for example, got its first DX support with the launch of Vista. But let's not kid ourselves: those programs are really only crippled versions of the actual DCC leaders like Max, Maya, XSI, Lightwave... And let's be serious for a moment: that software works beautifully on four-generations-old hardware because of its inherent purpose.

Thank you for reading,

That was short... maybe I should have published it :p
 
G

Guest

Guest
Also, why test AutoCAD for 3D use? Inventor is the Autodesk 3D modeling product. AutoCAD is so 10 years ago.
 
G

Guest

Guest
hi!
Sorry to bother you, but as you and the readers explain, "pro 3D" and "gamer" cards are about the same.
And since you only get one per system, it would be VERY INTERESTING to test "pro 3D" and "gamer" cards with all the 3D AND game benchmarks you have.
Because when you buy a workstation for personal use (I know a lot of friends doing this), you also like to relax by playing a game!

And I'm pretty sure you will see that the current top-end "gamer" cards (with standard drivers) are much more bang for the buck than their old "pro 3D" equivalents (with specialized drivers).

Another example: Apple, Dell, or HP will sell you a Quadro FX card in systems designed for... video and multimedia work. Why? I'm pretty sure a gamer card would do the same job (or better) for less money.


fredsky
 
G

Guest

Guest
And also a comment about "certified drivers" and "high-end 3D applications":

1) As @eodeo wrote, "Chances are that your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can utilize only a single CPU core." This is such a SHAME, man!!! Come on, we've got 4 viewports and plenty of cores. Autodesk and others, WAKE UP and use one core per viewport!!!

2) I used to work on certifying Autodesk 3ds Max with NVIDIA cards/drivers. In practice, you have to take all the drivers available for the Quadro and test them (script-driven benchmarks), and then you approve or reject each driver on quality. So you end up with OLD hardware and also OLD drivers... such expensive work to end up with CRIPPLED things.

fredsky
 

snipster4

Distinguished
Aug 20, 2006
58
0
18,630
This should be tested on Vista 32-bit/64-bit. I know a lot of CGI artists who are using Vista with DX10. Almost all professional apps run under Vista with DirectX 10, e.g. AutoCAD 2009 and Inventor 2009.

I have 5 Quadro 1500 cards running under XP 32-bit and Vista 64-bit. My one gaming machine, with an overclocked 8800 GT under Vista 64-bit, kills all my other computers running Quadro cards in AutoCAD 2009, 3ds Max 2009, and Inventor 2009.

Pro cards are no use anymore, as they are optimized for OpenGL and most software manufacturers are removing or limiting support for OpenGL, case in point Vista.

I wish I had saved my $3,500 in Quadro cards and purchased more 8800 GTs, or even an ATI 4870 X2. Note: there's not much difference between the Quadro 1500 and the Quadro 1700 in real-world comparison.

I would like to see some benchmarks of the current GeForce 280/ATI 4870 cards vs. these pro cards under Vista DirectX 10.
 

yyrkoon

Distinguished
May 8, 2006
387
0
18,780
There are two reasons for card instability: 1) inadequate cooling, or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) be more stable.

This is not true. Improperly written applications, improper power (for whatever reason), and hardware incompatibilities are just three more. I can probably think of others.

Perhaps you meant when everything else is working perfectly?

Just as an example: a few years ago, when a buddy of mine built his P4 system, we used a certain name-brand motherboard coupled with a factory-overclocked ATI 9600 Pro. The motherboard had very tight tolerances on the AGP bus, and the video card drew slightly more power than the specification called for. Technically the video card should have had a plug for auxiliary power, and someone eventually came up with a mod that fixed the card on these motherboards by adding one. Anyway, the end result was that Windows XP would seemingly randomly lock up 1-3 times a day.

Anyhow, my point here is that there are more than just two potential problems in a situation like this, and it surely is a PITA to troubleshoot them. General statements like this are at best only half correct.

Granted, professional-grade applications had better be written properly, and for a $1,000+ USD graphics card, the drivers had better be written properly as well.

As for the rest of your comments... very informative, thanks a bunch :)
 

sma8

Distinguished
Aug 13, 2008
3
0
18,510
[citation]Not true I can flash the bios on my 8800 GTX and it will run just like it's workstation cousin. They are using the same hardware but handicapping the consumer card.[/citation]

BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because a BIOS flash alone doesn't make it a Quadro. The only way to get it to work as a Quadro is by using RivaTuner. And once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
I wish I had saved my $3,500 in Quadro cards and purchased more 8800 GTs, or even an ATI 4870 X2.

I was trying to cut down on words and still get the main message across. Thus, I failed to mention that no "professional" application can use more than a single GPU; this means no CrossFire/SLI support and no X2-card support either. That doesn't mean you can't use these setups with them, just that they won't utilize the extra GPUs. Only games will use all the extra power, and in truth, only games need it.

I would like to see some benchmarks of the current GeForce 280/ATI 4870 cards vs. these pro cards under Vista DirectX 10.

That would seem interesting, but as far as 3ds Max goes, D3D under Vista is horrible. My 8800 GTX uses less than 5% of its strength in Vista. It's so much slower that it's actually slow(!). This has to do with the fact that either 1) Vista uses DX10 primarily and Max 2009 only just got full DX 9.0c support/features, or 2) Autodesk simply did a poor job with the Max/Vista implementation.

On the other hand, I've heard nothing but words of praise for AutoCAD under DX in Vista, so it's most likely option #2 above.

This is not true. Improperly written applications, improper power (for whatever reason), and hardware incompatibilities are just three more. I can probably think of others.

Yeah, but all three of the things you mentioned are going to affect gamer and pro cards equally. That just falls under "vis major", and there's nothing you can do about it. Debunking the belief that pro cards are inherently more stable was my only intent. Truth is, a squirrel can chew through your power line and crash your whole system, but it's not going to matter whether you have a Quadro or a GeForce inside :)

As for the rest of your comments... very informative, thanks a bunch :)

Good to hear :)

BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because a BIOS flash alone doesn't make it a Quadro. The only way to get it to work as a Quadro is by using RivaTuner. And once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.

The whole point of what I wrote above is to show that you shouldn't want to turn your GeForce into a Quadro, even if you could. There's no reason to do so, really. Apples to apples, the pro variants are actually slower than their gaming equivalents, because the manufacturers slow them down on purpose, for "stability".
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
A few years ago, when a buddy of mine built his P4 system, we used a certain name-brand motherboard coupled with a factory-overclocked ATI 9600 Pro.

I find it very interesting that you chose to mention the ATI 9600 Pro. I just wrote about my old ATI card recently on the official Max forums, the Area, here:
http://area.autodesk.com/index.php/forums/viewreply/78300/
 

snipster4

Distinguished
Aug 20, 2006
58
0
18,630
Apples to apples, the pro variants are actually slower than their gaming equivalents, because the manufacturers slow them down on purpose, for "stability".

I find that, of my 6 computers running pro apps, the ones with Quadro cards crash more often than my GeForce 8800 GT machine. A big one is transparency in windows in apps like AutoCAD 2009, or flickering in the display when rotating 3D objects in AutoCAD/Max. I find the 8800 GT ($200), with its extra memory, handles complex models with textures a lot better than my Quadro 1500 ($600); I would have to get a $1,500+ Quadro card to even try to compare with the 8800 GT. OpenGL is a thing of the past and will be completely obsolete within the next few years.

I think the pro cards vs. gaming cards debate will go on until people compare their real-world performance, working on a project side by side, and really notice the difference. We are switching all computers to Vista 64-bit, as XP 64-bit is junk and we require 8+ GB of RAM; that's why these benchmarks should also include a Vista 64-bit system, to see how the tests come out.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
OpenGL is a thing of the past and will be completely obsolete within the next few years.

That sounds really good, but Apple and the OpenGL people are not stupid. They've seen that they've been overrun, and they plan to release OpenGL 3.0 by the end of this year. As much as I don't like OGL now, competition is good for the end customer, and letting D3D rule unchallenged would be a bad thing in the end. Sloth is common when ruling unchallenged, so competition is good, and it looks like we're going to get some. How well it will perform, and what additions it will bring, we will have to wait and see.

From Wikipedia:
http://en.wikipedia.org/wiki/OpenGL
It was released on August 11, 2008.

So I'm wrong, and it's out. Now we only have to wait for OGL 3.0-compliant hardware. Or maybe not. The latest hardware is from ATI, with DX 10.1 and Shader Model 4.1 alongside OpenGL 2.1. Will OpenGL 3.0 routines work on 2.1-compliant hardware? I don't know.
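
One way to answer the "what does my driver actually expose" question is simply to ask it. A small sketch, assuming a Linux box with the standard `glxinfo` utility installed and on PATH:

```python
# Sketch: print the OpenGL version string the installed driver reports.
# Assumes the standard Linux `glxinfo` tool is available.
import subprocess

out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if line.startswith("OpenGL version string"):
        print(line)   # e.g. "OpenGL version string: 2.1.2 NVIDIA ..."
        break
```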

I think the pro cards vs. gaming cards debate will go on until people compare their real-world performance, working on a project side by side, and really notice the difference.

There is no difference to be observed there, save for the drivers pushing the cards. And not the drivers in full, just the parts that cripple the gaming cards.

The thing to keep in mind is that the cards are NOT different in anything but name and price. All hardware-literate people know this, and the only sensible solution would be to shut down the unnecessary lines, Quadro/FireGL. But until people stop buying these cards, there is no reason for the manufacturers to stop producing them. They might even go on the offensive and try even harder to cripple their own gaming cards via drivers, making them work slower in "professional" applications. If they are persistent enough, I'm sure they can find a way to cripple even D3D operations on a per-application basis.

We are switching all computers to Vista 64-bit, as XP 64-bit is junk and we require 8+ GB of RAM; that's why these benchmarks should also include a Vista 64-bit system, to see how the tests come out.

That depends on what software you are using. For me, Vista is unusable at the moment for anything but DX10 games, since it slows down my 8800 GTX so much that it's actually slow in 3ds Max. Windows XP x64, on the other hand, gives me full GPU speed along with a rock-stable OS whose 64-bit address space of 16 exabytes comfortably covers all of my 8 GB.
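
For the record, the arithmetic behind that "16 exabytes" figure: a 64-bit pointer spans 2^64 bytes of address space (actual Windows versions cap usable physical RAM far lower):

```python
# 64-bit address-space arithmetic behind the "16 exabytes" figure.
addr_space_bytes = 2 ** 64
print(addr_space_bytes / 2 ** 60, "EiB")   # 16.0 EiB of address space
print((8 * 2 ** 30) / addr_space_bytes)    # tiny fraction used by 8 GB
```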


I use Vista, but only to play DX10 games. I even browse the web in Win XP x64, as Vista is just plagued with backward incompatibility. I use the K-Meleon browser; save for its Vista integration, I think it's the best browser available today. Firefox 3.0 is getting closer, but not yet: it doesn't support mouse gestures, and previous plugins don't work at the moment. Opera 9.5 is also close, but lacking in too many areas to be directly compared. IE is just out of its league here.

On the other hand, as I've mentioned before, I've heard nothing but words of praise from AutoCAD users running under DX in Vista.
 