AMD FirePro V8700: High-End Workstation Graphics

I also wonder if you would take the FirePro and Radeon cards and use the latest version, 3ds Max 2009, as a comparison. Why are they using such an obsolete version? I know they would have to re-benchmark all the cards on the same version... but c'mon... version 4? What version of OpenGL is that, anyway? That could even be why the new card suffers in these benchmarks?

TOM'S HARDWARE... time to upgrade your software!!! 3ds Max 2009, please?
 
For argument's sake:

The RV630-based (2600Pro) FireGL V3600 scored 95.15 on the Maya benchmark according to Tom's charts. This is still more than double what the gaming HD 4870 scored on the same benchmark. The V3600 is still available on Newegg for $150. The RV730-based (HD 4650) FirePro V3750 is also available on Newegg for $170 and should (theoretically) be a much better performer than the V3600.

Looks to me like using a gaming card in these apps is like driving with your parking brake on. Even the low-end pro cards kill the high-end gaming cards.
 
I'd really like to see a desktop card tossed into these tests one day.
Optimized, schmoptimized, but do these cards really provide bang for the buck? Especially at the lower end.
 
Great article; good to see those benches for industry CAD applications on a standard gamer card. Nice to know that ATI and Nvidia have reason to charge massively more for workstation graphics. However, at that price you could feasibly get a high-end QuadFire or tri-SLI configuration. I'm gonna say there's way more computing potential in multi-GPU solutions; again, it's a matter of driver optimization.
 
I would like to know how they made 3ds Max 6x faster. It's a DirectX application. Rendering a 3D scene in 3ds Max is no different from rendering a scene in a game. How would the drivers even know what application or game was using them?

Maybe they intentionally cripple the drivers for the 4870?

 
[citation][nom]kansur0[/nom]I also wonder if you would take the FirePro and Radeon cards and use the latest version, 3ds Max 2009, as a comparison. Why are they using such an obsolete version? I know they would have to re-benchmark all the cards on the same version... but c'mon... version 4? What version of OpenGL is that, anyway? That could even be why the new card suffers in these benchmarks? TOM'S HARDWARE... time to upgrade your software!!! 3ds Max 2009, please?[/citation]
Do you have any idea how much this would cost? And all of that for one article that a majority of their demographic couldn't care less about?
Let me rephrase that question into a statement: Not going to happen.
 
Holy crap. Can you imagine how awesome it would be to flash one of these dual-BIOS cards with this BIOS?

My Palit 4870 has a switch to change between two separate BIOSes.

People could buy it, then flash one BIOS with the FirePro ROM and have the modeling power they desire as well as the gaming potential, at a cheap cost. However, flashing pro cards has proven to be quite difficult in my experience...
 
[citation][nom]DXrick[/nom]I would like to know how they made 3ds Max 6x faster. It's a DirectX application. Rendering a 3D scene in 3ds Max is no different from rendering a scene in a game. How would the drivers even know what application or game was using them? Maybe they intentionally cripple the drivers for the 4870?[/citation]

While the drivers CAN detect the application running (Catalyst drivers do this to load CrossFire profiles during games), I too wonder what possible kind of optimization gets 6x performance at the software level. Wouldn't you pretty much have to rewrite 3ds Max to get that kind of difference? If a program uses a standard, OpenGL or DirectX, isn't it the hardware's responsibility to handle that? How can two applications use a standard so differently that the drivers make this much of a difference?

Basically, are we to assume that all OpenGL and DirectX translation happens on the CPU in the drivers, which then feed basic machine code to the hardware? Is this why some games require such a massive CPU just for eye candy? I was under the impression that a DirectX card meant hardware support, not driver support, but it seems the underlying hardware doesn't mean squat in the face of good drivers. A 6x performance increase from only a driver change means the drivers are doing too much, in my opinion.
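For what it's worth, the application-detection half of that is trivial. Here's a toy Python sketch (illustrative only; the profile names and settings are made up, and real driver code looks nothing like this) of how a driver might pick a tuning profile from the running executable's name:

[code]
import os
import sys

# Hypothetical per-application profiles keyed by executable name.
# Real drivers ship far larger tables than this.
PROFILES = {
    "3dsmax.exe": {"vsync": False, "line_antialias": True},
    "maya.exe":   {"vsync": False, "line_antialias": True},
}
DEFAULT = {"vsync": True, "line_antialias": False}

def pick_profile(exe_path):
    """Return the tuning profile for a given executable path."""
    exe_name = os.path.basename(exe_path).lower()
    return PROFILES.get(exe_name, DEFAULT)

if __name__ == "__main__":
    # e.g. python profiles.py "C:/Program Files/Autodesk/3ds Max/3dsmax.exe"
    target = sys.argv[1] if len(sys.argv) > 1 else "game.exe"
    print(pick_profile(target))
[/code]

The hard part, as you say, is what goes inside those profiles; detecting the app is a dictionary lookup, so a 6x gain has to come from somewhere else.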
 
Hey, I was thinking of building a gaming PC (desktop). I'm on a budget of about $1,200, and I was wondering what graphics cards would give me moderate performance, reliability, and the most for my money. Thanks.
 
[citation][nom]corncake21[/nom]Hey, I was thinking of building a gaming PC (desktop). I'm on a budget of about $1,200, and I was wondering what graphics cards would give me moderate performance, reliability, and the most for my money. Thanks.[/citation]
Here was the message for my Moderator Alert, Corncake:
"Moron asking stupid questions about advice for his computer in a totally unrelated thread."
 
The difference between the V8700 and the HD 4870 is like the difference between a graphics processor and a general-purpose CPU: these cards are configured to do more and deliver more performance in specific apps like Maya or 3ds Max, so don't expect any performance gains from them in games! Rendering in a game is very different from rendering in a workstation application!
As for a BIOS hack, I don't think so. They are not dumb enough to set fire to their money again; they have probably done something to prevent a BIOS hack by now!
 
[citation][nom]vaskodogama[/nom]The difference between the V8700 and the HD 4870 is like the difference between a graphics processor and a general-purpose CPU: these cards are configured to do more and deliver more performance in specific apps like Maya or 3ds Max, so don't expect any performance gains from them in games! Rendering in a game is very different from rendering in a workstation application! As for a BIOS hack, I don't think so. They are not dumb enough to set fire to their money again; they have probably done something to prevent a BIOS hack by now![/citation]
You have no idea what you're talking about. Other people have done such hacks; there are guides on how to do them. It's not hugely difficult if you have the right card. The hardware is the same; it's just drivers. Go to the ATI or Nvidia site and look at the drivers for the workstation cards. They actually write drivers for each application.
 

Did you read the article?!?!?
 
[citation][nom]KyleSTL[/nom]Did you read the article?!?!?[/citation]

My thoughts exactly. It's kinda funny that many people commenting on this article have the whole home-user/gamer mentality.

Someone who buys such a professional card will obviously be able to afford it, and he'll make his money back from the workload the card gets through. ATI and Nvidia are not crippling their gaming cards or anything. They heavily optimize their pro drivers (like, REALLY heavily), and they know those cards will only be used with a handful of well-matured applications. You should also consider that many of the applications themselves have code optimized for FireGLs/Quadros and not for mainstream gaming cards. Thus you see up to a 6x performance difference.
 
[citation][nom]vaskodogama[/nom]The difference between the V8700 and the HD 4870 is like the difference between a graphics processor and a general-purpose CPU: these cards are configured to do more and deliver more performance in specific apps like Maya or 3ds Max, so don't expect any performance gains from them in games! Rendering in a game is very different from rendering in a workstation application! As for a BIOS hack, I don't think so. They are not dumb enough to set fire to their money again; they have probably done something to prevent a BIOS hack by now![/citation]
Doesn't matter what they do, and you don't even need a hacked BIOS to do it. All you would need is a FirePro user to post the BIOS from their card, then use any one of the dozens of ATI BIOS tools to flash it in. The only thing you'd really need to do is change the memory timings, if needed, and maybe remove DisplayPort support if it messes anything up.
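For the curious: the reason a straight ROM swap changes what the driver sees is that the card's PCI IDs live in the BIOS image itself. Here's a quick Python sketch that reads them out of the standard PCI expansion ROM header (point it at a ROM you dumped yourself; this only reads a file, it doesn't flash anything):

[code]
import struct
import sys

def read_pci_ids(rom_path):
    """Extract the vendor/device IDs from a PCI expansion ROM dump."""
    with open(rom_path, "rb") as f:
        rom = f.read()
    # Every PCI option ROM starts with the 0x55 0xAA signature.
    if rom[0:2] != b"\x55\xaa":
        raise ValueError("not a PCI expansion ROM image")
    # Offset 0x18 holds a 16-bit pointer to the PCI Data Structure.
    pcir = struct.unpack_from("<H", rom, 0x18)[0]
    if rom[pcir:pcir + 4] != b"PCIR":
        raise ValueError("PCI Data Structure not found")
    vendor_id, device_id = struct.unpack_from("<HH", rom, pcir + 4)
    return vendor_id, device_id

if __name__ == "__main__":
    vid, did = read_pci_ids(sys.argv[1])
    print("vendor 0x%04x, device 0x%04x" % (vid, did))
[/code]

Compare a FirePro dump against your 4870's dump and the device IDs are where they differ; that ID is presumably what the workstation driver keys on.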
 
I get the driver-cost issue. Both makers sell massive quantities of gaming cards. They don't sell nearly as many workstation cards, which are tuned for specific non-consumer programs, so the drivers will cost more: simple economies of scale. Plus, the commercial market can't afford to wait for a driver bug to get fixed the way consumers do with games; it would cost too much money in downtime, so more time is spent fine-tuning the drivers before release. Do you really think either maker would allow a simple BIOS flash to bypass all that hard-earned work?

Why was the testing done at 1600x1200 and 1280x1024? The three DisplayPort monitors mentioned in the article are 1920x1200 and 2560x1600. Why no tests at those resolutions? If the software is silly money and the workstation costs silly money, why would you look at all that money on a 1280x1024 monitor and not on a nice, large, silly-money 30-incher?

Thanks for the info on that market segment; it's interesting to see how they transfer gaming technology to industrial use.
 
Is it possible to get the best of both worlds by buying a low-end workstation card and, say, a high-end gamer's card, then installing them both in the same system?
 

A much better method would be to buy a high-end card with a workstation equivalent and flash its BIOS to that of the workstation card. Oftentimes the base drivers aren't actually different; it's all about the application-specific driver work, which is expensive to develop.
 
Speaking from the standpoint of having developed software for a high-end printing solution, it is highly unlikely that the drivers are separately optimized for these applications and shipped apart from the standard drivers. It is more likely that a single driver codebase recognizes a flag in the BIOS that tells it to turn on the optimizations.

Way back when the GeForce II was available, there was a hack that involved repositioning a resistor on the circuit board. Do that, and the card was recognized as a Quadro II, and all the Quadro II options became available in the standard "gamer" drivers.
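If that's right, one driver codebase just branches on the ID it reads back. A toy Python sketch of the idea (the IDs and feature names below are placeholders I made up, not real PCI IDs or driver options):

[code]
# One hypothetical driver binary, two behaviors, selected by device ID.
WORKSTATION_IDS = {0xAAA0, 0xAAA1}  # placeholder "FirePro" IDs
GAMING_IDS      = {0xBBB0, 0xBBB1}  # placeholder "Radeon" IDs

def driver_features(device_id):
    """Decide which code paths to enable for the detected GPU."""
    workstation = device_id in WORKSTATION_IDS
    return {
        "accelerated_wireframe":   workstation,
        "certified_app_profiles":  workstation,
        "crossfire_game_profiles": not workstation,
    }

print(driver_features(0xAAA0))  # workstation paths enabled
print(driver_features(0xBBB0))  # gaming paths enabled
[/code]

Same silicon, same driver; flip one ID (a resistor back then, a BIOS field now) and a different set of code paths lights up.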

I would not put it past Nvidia or ATI to use the same sort of methodology today, that is, to ship the exact same software, written by the same developers, for different purposes. It is the most cost-effective and profitable way to do it. The workstation application market is a "professional" market, and it has deep pockets. That said, it is interesting to note that the article says:
"but as soon as you load a professional graphics application such as Maya or 3ds Max and import a complex 3D model"
The operative words are "complex 3D model." What counts as complex is a matter of interpretation; however, this implies that simpler models are a different story, and that the difference in rendering performance may not be significant with a simpler model.

So, what does this mean? If you are doing serious engineering work where your models have hundreds or thousands of parts, then these cards are worth the extra money. However, if you are a part-timer, or your models have tens of parts, a "gamer" card may suffice for you.

I own SolidWorks 2003, and I have an 8800 GTX 512 MB card. The machine I run it on is a 2.8 GHz dual-core Opteron. For what I do, the power here is more than enough. However, it is great to see the comparison against "pro cards," and it is also great to see the price of these "pro cards" come down significantly. I hope the gaming comparison continues in future "pro card" reviews.

I would also love to see a review of these super expensive quadros where they compare them against the identical "gamer hardware."

The question is: do you really need the X increase in performance, and is that X increase worth the XX increase in price?
 
The term you are all looking for is "softmodding," and it is currently being pursued on the Radeon 48xx line. Unfortunately, GeForce cards can no longer be modded into Quadros, since Nvidia now laser-etches the chips, preventing any such BIOS flash. AMD has not yet followed suit, and many people successfully flashed their 3870s into FireGLs, with performance almost matching the workstation cards, albeit with some bugs (mostly sorted out in later revisions). We'll see what happens with the RV770 soon enough.

Hope that helps.
 
Just a comment on the name switch: "GL" implies OpenGL, while the newly introduced OpenCL and DirectX are APIs this card also handles very well. So if the name stayed FireGL, some consumers might misinterpret the card as being for OpenGL only... as it once was.
 
[citation][nom]Harby[/nom]You're clueless. The price premium is for the drivers themselves, not the hardware. No one crippled your gaming card, but no one optimized its drivers for workstation applications either. And these optimizations are not simple tweaks but massive, careful code written to give you a huge performance boost under very, very specific applications. You could go as far as saying that you're in essence buying an expensive piece of software, not just a graphics card.[/citation]

Yay! So that means ATI's open-source drivers should, in theory, eventually let you get these performance gains in DCC apps from a gaming card.
 
Where are the FirePro V3700 and V5700 reviews Tom's posted last August? For goodness' sake, they're listed in the related articles alongside this one!
 
"Drivers and 3yr support aside, I really really really really would have liked to see the top 4870 1gb consumer card thrown in just for a reference point. It would do wonders to help justify the price markup to my boss that signs the purchase order."

Good luck with that one... THG would never receive, or be able to review, another video card again. Do you seriously think ATI/Nvidia send out cards and/or allow reviews of their hardware without stipulations preventing exactly this?
 