Workstation-Shootout: ATi FireGL V7600 vs. Nvidia Quadro FX 4600

KwyjiboNL77

Distinguished
Jan 17, 2007
65
0
18,640
I am somewhat confused about how these tests were conducted. Why in the world were they run with Maya 6.5? That release is years old, and subsequent releases have focused heavily on improving GUI/viewport performance. When testing the latest cards available, it would be smart to also test the latest available version of Maya.

Another thing I don't get is why a 64-bit version of Windows was not tested. Most workstations nowadays run with more memory than a 32-bit OS can comfortably handle (add the amount of memory found on these cards and you run out of address space quickly), which is why most 3D/DCC apps today have fully 64-bit versions available. Maya has been available in 64-bit for at least the last two full versions. To me the testing environment used here seems awfully shortsighted...
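The address-space squeeze described above can be shown with back-of-the-envelope arithmetic. This is a simplified sketch; the kernel/user split is the 32-bit Windows default, and the aperture and mapping sizes are illustrative assumptions, not measured values:

```python
# Rough sketch of the 32-bit squeeze: a process's 4 GiB of virtual
# addresses gets eaten by the kernel split, the GPU memory aperture,
# and other mappings. Figures below are illustrative assumptions.

ADDRESS_SPACE_MIB = 4 * 1024   # 4 GiB of 32-bit virtual addresses
KERNEL_SPLIT_MIB = 2 * 1024    # default Windows kernel/user split
GPU_APERTURE_MIB = 512         # e.g. mapping a 512 MiB card's memory
OTHER_MAPPINGS_MIB = 512       # DLLs, MMIO, driver reservations (guess)

user_space = ADDRESS_SPACE_MIB - KERNEL_SPLIT_MIB
left_for_app_data = user_space - GPU_APERTURE_MIB - OTHER_MAPPINGS_MIB

print(f"User-mode address space: {user_space} MiB")
print(f"Left for the app's own scene data: {left_for_app_data} MiB")
```

Under these assumptions only about 1 GiB remains for the application's own data, which is why large scenes push DCC apps toward 64-bit builds.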
 

striker63

Distinguished
Nov 8, 2006
6
0
18,510
I also don't totally understand the decision to use DirectX as the only reference in your testing. A large number of the robust 3D development apps run on OpenGL, and they are why companies drop BIG $$ on cards like the ones being reviewed today. OpenGL needs to be a consideration when reviewing workstation cards.
 

the_computer_dud

Distinguished
Oct 12, 2007
2
0
18,510
In terms of performance, how do these cards (and probably mainly their drivers) compare to their consumer-level counterparts? It's an age-old question that no one seems to take the time to answer, and seeing how the dies are the same these days makes it even more valid.

Also, what are the advantages these cards have, or what will a professional CAD or graphic designer gain by spending the extra money required for one of these models?

I've looked into this in the past and have really only found a couple of differences, such as (something along the lines of) "hardware accelerated lines", which was disabled on the chips of the consumer-level cards. But what benefit do "hardware accelerated lines" provide anyway?

Also, occasionally there are "performance drivers" available; were any of these used in the test? (I know Nvidia has them for Quadro & Max.)
 


Did they mention that it was a 32bit version of Windows? :heink:

I see no mention of that, but the drivers for the workstation cards are the XP64 versions of Cat and Forceware, so to me that says 64bit Windows XP.

As for the versions of software chosen: while there is a more current version of Maya, there is no accompanying version of test software from respected groups like SPEC; their latest version is for 6.5:
http://www.spec.org/benchmarks.html#gpc

So just like 3DMark06 doesn't test the DX10 components of DirectX for gaming systems, there is no replacement for it yet. Until SPEC comes out with a newer version, everyone is pretty much limited for comparison purposes to what is available and globally accepted as a standard. I wouldn't trust someone's own model(s) and tests with U08; I want repeatable tests, and the Maya tutorials like Werewolf and Squid are getting old (they showed little difference even back in the GF7/X1K generation, when there were greater differences between the cards).

As for OGL, striker, there are OGL tests there, so why do you say only DX?
It looks like they chose the DX version of 3DSMax because it adds variety and also because it is a growing component of the 3DSMax user base. It would be nice to compare both, as some people do, but really it's a judgement call.

Anywhoo, IMO the more information the better, so I'm happy for another review, but I do agree: I'd love the additional information of the OGL path for 3DSMax too.
Please Sir, may I have another benchmark/test. :whistle:
 

Ilander

Distinguished
Jul 22, 2007
173
0
18,710


Aye, I totally agree, this must be addressed by someone out there, and THG very well could have done it with this review...

I know that the consumer level cards offer absolutely fine base functionality, and they all support OpenGL 2.0 these days, so I keep telling people that they don't need workstation cards (people like my girlfriend, a studying-to-be architect).

It's an important factor.

Plus, I'm pretty sure ATI and nVidia would ship a lot more silicon if the same card worked top to bottom for gamers, video editors, and engineers alike... right now, it just seems like they're abusing the segment with supposed exclusivity that's only realized in select applications, which they could easily support with lesser products.

...and, of course, just one test with a game and a workstation card. Just one. Please. Can my girlfriend, who needs to run Revit well, also play The Sims, or something more advanced, like Portal?

It's a strange world we live in where you can buy a digital camera/mp3 player but you need one graphics card to design buildings and another to display buildings someone else designed in a video game.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
I totally agree that I'd love to see 3DMark06 for a Quadro with 768 MB and an 8800 GTX on an otherwise identical hardware/software platform.

My big theory is that companies buy Quadros mostly out of either elitism or ignorance rather than for any real performance advantage.

My other theory is that the reason that no tech websites ever publish the results of a truly comparative performance test is political in that they don't want to be the first ones to blow nVidia and ATI's most profitable scam.

In fact I recollect some benchmarks maybe 4 or 5 years ago of the then-current chipset in a Quadro ($thousands) and a consumer video card ($hundreds) that strangely showed the Quadro was marginally slower at gaming. Also, back in GeForce 2 days, there were instructions floating around the internet on how you could just swap 1 or 2 resistors on a GeForce 2 and turn it into a Quadro (it would even identify itself as a Quadro at bootup, and the nVidia driver reported it was a Quadro too), so there's not (or wasn't) any actual hardware difference.
 


They gain a certification-compliant part, higher quality control, specialized drivers, and better customer support. The value of these varies depending on the end user; prosumers don't value those aspects as much as true professionals, most of whom don't even pay for these themselves. The company does, and the cards are either built into costs, amortized, or written off.

>> Plus, I'm pretty sure ATI and nVidia would ship a lot more silicon if the
>> same card worked top to bottom for both gamers, video-editors, and
>> engineers...right now, it just seems like they're abusing the segment with
>> supposed exclusivity only realized in select applications, which they could
>> easily support with lesser products.

I doubt they'd sell any more silicon after the initial blip, and even that would likely end up being less than 1% of their total. However, if they offered all of those benefits without the FireGL hardware requirement, they'd lose a lot of their Quadro/FireGL sales, thus reducing the resource base that pays for those added workstation optimizations, etc.

Anyone wanting many of the benefits of both while reducing the price should look towards SoftGL / SoftQuadro solutions, which give you a lot of the best of both worlds, and you can switch back and forth pretty easily.
 

justinmcg67

Distinguished
Sep 24, 2007
565
0
18,980
I was really looking forward to this article; in fact, when I opened up THG and saw this as the headline I literally said out loud, "ABOUT FU**ING TIME!! WHOO HOO!!" But I'm a little disappointed in this. I would've liked to know how these cards benefit people who use AutoCAD and the like. My father is a drafter and uses AutoCAD religiously. I tried to recommend a workstation card to him and the owner of the company he works for, but couldn't, because I had no idea what to get. Oh well... maybe next time I suppose.
 

Ilander

Distinguished
Jul 22, 2007
173
0
18,710
Softmodded 9800s are...well...obsolete. I only mentioned it as a reminder that in the past, we could do that. I heard that they worked fine for their time period.

I think a very beefy processor is an even more important factor than a workstation card, unfortunately. Maybe workstation cards take some of that load off compared to a gaming card, but I'd probably rather spend the $250 on a quad core and $100 on a graphics card with OpenGL 2.0 (and maybe squeeze the budget for 512 MB VRAM) than spend the $1000 on even the Fire card... and I don't think spending $300 on a really crappy Fire/Quadro card would be worth it at all. Ever.
 

KwyjiboNL77

Distinguished
Jan 17, 2007
65
0
18,640


Page 7, test setup:

Operating System: Windows XP Service Pack 2

Since XP64 has no SP2, and no special mention is made apart from it being XP, to me that means they haven't tested on XP64...
 

LAN_deRf_HA

Distinguished
Nov 24, 2006
492
0
18,780
I would have liked to see the V8650 vs. FX 5600 to know who holds the performance crown on the high end. Also I like it when they throw in a game benchmark for the heck of it.
 


It's rare that you see AutoCAD tests; even folks like 3Dchips and 3D Professor, who test more than most, don't use AutoCAD much for testing.

For some people, Xbit's tests may have the pieces they find missing in the THG article, with AutoCAD and 3DMark06 results in their test batch, but they haven't used a new FireGL, just the old X1800-based 7350, which is weak compared to the refreshes (the HD2600-based 5600 beats it). If you look at the AutoCAD situation, it's not really stressing the cards anywhere near as much as other 3D modeling tools.

http://www.xbitlabs.com/articles/video/display/quadro-fx5600-fx4600.html

Hope that helps, would be nice to see at least a FireGL 5600 or 7600 in there for a more recent comparison of cards.
 

Jimbo1234

Distinguished
Dec 13, 2006
6
0
18,510
For those of you confused as to why you would spend so much money on a workstation card instead of a gaming card for professional applications, here are a few reasons:

1. Certified to work with your application. That means you get accurate output, no driver-related crashes, and solid, repeatable performance.

I don't recall if this was mentioned in the article, but CAD software, more specifically, solid modelers, are nothing like games. Solid models for engineering use are exact. All extrusions, holes, rounds / fillets, are modeled and shown as such. Frame rates are not too important. In games, geometry need not be exact, as long as it looks good. Looking good will not create dimensioned drawings of real parts with tolerances, etc.

2. Autodesk AutoCAD is 2D software for drawing lines. You can argue otherwise, but that's all it really is. 3D functionality was added over time and is still terrible. It is legacy software that still has a purpose, but it is not designed for 3D solid modeling. Autodesk Inventor, on the other hand, is. Inventor is very similar to Solidworks. Both are mainstream solid modelers. There are also high-end modelers like Unigraphics NX.

3. D3D is becoming more accepted in the CAD world. Inventor R11 can use either D3D or OpenGL. Inventor 2008 uses D3D exclusively. It is important to test both because not all cards work equally well in both environments and not all programs support both APIs. Previous generation ATI cards were very slow in OpenGL, but fine in D3D. nVidia cards perform equally well in both APIs.

4. Here's the major reason to get a workstation card: working with multiple files open. A gaming card will choke if you open 5 or more part / assembly / drawing files at once. Workstation cards do not. When you design things, that is the typical environment.

5. In business, time = money. Capital purchases are not so costly. I will spend $1000 on a gfx card if it will save me even 30 minutes per day. Productivity is worth much, much more than that. I used to have a gaming card in my workstation when I first started at my job because nobody knew any better. I opened a few parts, and the machine essentially locked up. Losing work and then trying to recreate even simple things takes a lot more effort than a measly $1000-2000. I spent over $20,000 a few weeks ago on some mirror holders. Graphics cards, even in the price range above, are cheap business purchases.

6. Again, workstation cards are not for gamers / home users. They were never intended to be.
 

randomizer

Champion
Moderator
Well, gaming cards are designed to run demanding apps, but only ONE at a time. Most gamers aren't throwing guys into buildings in Crysis while sniping on another monitor in BF2. Plus, some workstation cards are massive, so they are inherently more expensive to make. Not the massive price difference that you see, but still more expensive. Being in the smaller segment of the market forces up the price too.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
@Jimbo1234:

Sorry, but nothing you said there suggests to me that a consumer/gaming version of the same GPU won't do just as well.

>> I don't recall if this was mentioned in the article, but CAD
>> software, more specifically, solid modelers, are nothing like
>> games. Solid models for engineering use are exact. All
>> extrusions, holes, rounds / fillets, are modeled and shown as such.

You're missing the point. The fact that the CAD application internally represents the objects with great detail and precision has nothing to do with the graphics engine or hardware. When a complex object comes to be displayed, the engine will still use algorithms like hidden-surface removal to avoid having to render surfaces you can't actually see. In fact, there's probably way more image complexity in any frame of a game like Crysis than there ever is in something like the display of a car engine or whatever.

Also, FYI, D3D does not calculate or show exactly correct perspective etc. (although OGL does). So if engineering labs are using D3D more now, that blows your argument about needing precise graphics cards for CAD out of the water right there.

Also, I've worked for a couple of years in an office where engineers were using CAD. They NEVER take measurements directly off the screen, so precise graphical accuracy is a moot point anyway.

I still maintain that if you were to benchmark a CAD workstation using a quadro then again with an exactly equivalent consumer-level card (same GPU, RAM and clocks) you would see exactly the same performance (notwithstanding any artificial crippling by the graphics driver based on detection of videocard product ID ).
 

Jimbo1234

Distinguished
Dec 13, 2006
6
0
18,510


I have used both pro cards and gaming cards for CAD work. Gaming cards are a waste of time. As I mentioned before, having more than one part open at a time brings a gaming card to its knees.

From Autodesk's site: (http://www.inventor-certified.com/graphics//faq2.php?n=SlowMultiWin&index=19#SlowMultiWin)
"This is common with graphics cards/drivers that are not listed as having Full CAD functionality. Graphics cards/drivers designed for the CAD industry are designed to support a large number of OpenGL windows with only a very small loss in performance. Cards/drivers designed for the computer game market do not have this functionality. They may even be limited to just one hardware accelerated window."

That's the OpenGL argument. But for D3D accuracy there is still an argument as in the example below.

Measurements are made with software tools but are graphics-dependent. Inventor, for example, calculates a chamfer dimension in a drawing from what is displayed on screen. The accuracy of the display is important.

See the response from Autodesk (http://discussion.autodesk.com/thread.jspa?messageID=5266666).

If you are a gamer and need to use the occasional CAD software in a tethered mode, or have a student copy, a gaming card may suit your needs. But I need a workstation card at work to get things done. I do not play games at work, and as I said before, time is worth more money than an initial capital investment.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
Which gaming card did you try compared to which workstation card?
My guess is you're not comparing like for like.
We need actual data for gaming vs. workstation card with same amount of ram on both video cards, same GPU, same clockspeed etc.
example nVidia 8800GTX ( ~ $500) and Nvidia Quadro FX 5600 (~ $1500)
I still bet they're both the same.
 

Jimbo1234

Distinguished
Dec 13, 2006
6
0
18,510


I don't remember what gaming card it was, but the ATI FireGL V3100 was much better. I am now using an nVidia FX1500. The nVidia card is faster in OpenGL and has fewer driver issues in the /3GB mode in XP Pro. Soon it will be time to upgrade.

DirectX is exclusively used in Inventor 2008 under Vista, not XP. I forgot to mention that earlier. This is because there is no OpenGL support in Vista as far as I know. Supposedly DirectX 10 addresses CAD issues. But if your software is not supported on Vista or does not support D3D, then you really do not have much of a choice.

This is a good discussion from Autodesk on the topic and another from MCAD Online.
http://discussion.autodesk.com/thread.jspa?threadID=483604
http://www.mcadonline.com//index.php?option=com_content&task=view&id=261&Itemid=1

So the latest gen gaming cards (8800) on Vista for D3D enabled CAD software should work. However, with the limited development time dedicated to this so far (about 4 years), I'd stick with XP and OpenGL on a workstation card. I have had problems with D3D in Inventor 11 SP3. Once we get 2008 in the office, I'll see how things go.

It would be nice to be able to use a cheap gfx card and have a few extra gigs of memory, or a faster CPU.

On another note, CAD software is much more complex than you think. There are tolerances, material properties, physical constraints, and all detail all the time. A truck engine in Crysis is just the external detail, not individual pistons, valves, bearings, cams, rods, and the constraints determining position and relative motion, etc. In a CAD model everything is modeled. Assemblies are also not just engines, but entire cars with thousands of parts, which, as the MCAD article says, need to hold water, not just look like they do. How about a model of a Boeing 787? The model must hold tolerances of 0.0001 inches or better on many parts, in an assembly that can be as large as a football field. So until there is data to support that there is absolutely no reason to use a workstation card versus a gaming card, guess what I will continue to use?
 

Ilander

Distinguished
Jul 22, 2007
173
0
18,710
Ah... so a gaming card probably would be best for my (occasionally) Inventor-using girlfriend when she's firing up some architectural work at home, so long as she knows not to open multiple buildings at once.

Jimbo...you've been very helpful.
 
