
FirePro V3900: Entry-Level Workstation Graphics

Status
Not open for further replies.

djjoejoe

Distinguished
Mar 16, 2012
2
0
18,510
0
If a large part of the difference between a workstation card and a gaming card can be the drivers, does this apply to gaming performance as well? Does the workstation GPU perform similarly to its desktop equivalent, or better, thanks to 'better' drivers? Or is it a case of the drivers being optimized for things that don't apply to gaming, so any performance increase only shows up in non-gaming applications?

Just curious :)
 

SpadeM

Distinguished
Apr 13, 2009
283
0
18,790
2
I'm curious about something: can you run one of these cards in a PC alongside a 560 Ti used for gaming, then switch the output on the back of the PC and select the workstation card for rendering in Max or Maya? Or do you have to reboot every time you change the video output?
 

RazorBurn

Distinguished
Feb 7, 2011
65
0
18,630
0
Both cards run on the same hardware; it's just that professional video cards have their drivers optimized for CAD/CGI work. It's like two identical SUVs with the same engine but different tires, one with plain road tires and the other with snow tires. The SUV with snow tires will certainly handle snowy terrain better than the one with road tires.

CAD apps like AutoCAD run better on professional video cards because of the optimized code in the drivers. Gaming video cards, by contrast, have their drivers optimized for games, not for these CAD apps.
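As a rough illustration of that driver split: applications often detect which "tires" they're running on by inspecting the OpenGL renderer string (normally obtained via glGetString(GL_RENDERER) inside a live GL context). A minimal sketch, with the marker strings chosen for illustration rather than taken from any specific driver:

```python
# Classify a GPU as workstation- or gaming-class from its
# OpenGL renderer string. In a real app the string would come
# from glGetString(GL_RENDERER); here we pass it in directly.
WORKSTATION_MARKERS = ("FirePro", "Quadro")

def classify_renderer(renderer: str) -> str:
    """Return 'workstation' or 'gaming' based on marker substrings."""
    if any(marker in renderer for marker in WORKSTATION_MARKERS):
        return "workstation"
    return "gaming"

print(classify_renderer("AMD FirePro V3900"))   # workstation
print(classify_renderer("GeForce GTX 560 Ti"))  # gaming
```

This is only a sketch of the idea that the same silicon gets routed down different driver code paths depending on the product line it identifies as.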
 

Olle P

Distinguished
Apr 7, 2010
531
10
19,015
23
It would be nice to see one or two games thrown into the test.
Just for the heck of it, and also to answer the question:
- Which card is the better choice for my workstation if I'd also like to run a game or two during the lunch break?
 
G

Guest

Guest
These clowns need to be brought into court for this intentional crippling of desktop GPUs and price fixing of workstation cards.

This travesty needs to stop.
 
G

Guest

Guest
Exactly. Given how often the question "How well will this or that pro card perform in games?" gets asked, I can't believe at least one or two game benchmarks weren't included.

I'd especially like to see some benchmarks on mid-range pro cards.

Also, same question as above: can I use a professional CAD graphics card alongside a gaming card and get CAD benefits on one monitor and gaming on the other?
 

EDVINASM

Distinguished
Aug 23, 2011
247
0
18,690
2
[citation][nom]MarriedMan[/nom]Exactly. With how often the question is asked, "How well will this or that pro card perform in games?", I can't believe at least one or two game benchmarks weren't included.I'd especially like to see some benchmarks on mid-range pro cards. Also, same question as above, can I use a Profession CAD graphics card along side a gaming card and get CAD benefits on one monitor and gaming on the other.[/citation]

Unless your motherboard supports switching PCI Express slots off via software, you can't. Even if it did, you would need to restart. Plus, knowing AMD driver compatibility and reliability, I wouldn't even hope at the moment. If you are gaming a lot and also doing a lot of 3D, the question is what matters more to you, games or 3D content creation. If you are just a beginner doing CAD for fun, you will get by with a gaming GPU. Otherwise you must be making money on your projects, and you should be able to afford a mid-to-high-end GPU for CAD.
 
Nice article and thank you!

Holy cow, you weren't kidding when you said 'entry level'; this is more like 'impoverished level.'

To me, entry level means sub-$400 cards: the Nvidia Quadro 2000 series and the AMD FirePro V5800. Obviously, pro GPUs are tailored for their use.
 

Microgoliath

Distinguished
Dec 13, 2011
113
0
18,680
0
This is just to make more money. I'm pretty sure they could merge both drivers (obviously it would be a bigger driver) with code optimizing both gaming and CAD-related stuff, since both GPUs use the same hardware.
 

EDVINASM

Distinguished
Aug 23, 2011
247
0
18,690
2
[citation][nom]Microgoliath[/nom]This is just to make more money, I'm pretty sure they can mix both drivers (obv gna be a bigger driver then) containing both codes to optimize both gaming and CAD related stuff since both gpus use the same hardware.[/citation]

The key word is support. Try to reach support with your 7-series gaming GPU, and then try the same as a professional CAD user with a CAD-dedicated FirePro.
 

fuzznarf

Distinguished
Sep 22, 2011
120
0
18,680
0
This should have included some sort of Blender benchmark. Not everyone who uses a workstation is a CAD designer. Some of us do modelling and rendering with Blender.
 

fir_ser

Distinguished
Apr 7, 2011
739
0
18,980
0
It's good to hear that Tom's is going to include workstation graphics cards in its charts; I hope they will include the high-end ones such as the Nvidia Tesla.
 

EDVINASM

Distinguished
Aug 23, 2011
247
0
18,690
2
[citation][nom]fuzznarf[/nom]This should have included some sort of Blender benchmark. Not everyone who uses a workstation is a cad designer. Some of us do modelling and rendering with Blender.[/citation]

Blender is a free tool. AMD would hardly spend money optimizing for freeware.
 

fuzznarf

Distinguished
Sep 22, 2011
120
0
18,680
0
[citation][nom]edvinasm[/nom]Blender is a free tool. Hardly AMD would be spending money to optimise for freeware.[/citation]
It's not about optimization for a free tool; the cost of the tool isn't relevant. It is probably the most used tool in the graphical modelling/rendering world, hence a benchmark would be nice. Like I said, not everyone is building 3D engineering schematics with CAD.
 

CaedenV

Splendid
[citation][nom]edvinasm[/nom]Blender is a free tool. Hardly AMD would be spending money to optimise for freeware.[/citation]
Blender may well be a free tool, but it is amazingly powerful, and many large companies use it with their own UI and plugins for very large projects. It is used for everything from movies to video game design, and it would be very nice to see how it stacks up.
 

Onus

Titan
Moderator
Joshkorn beat me to it; this is perhaps the one time where the question "but can it play ?" is relevant. I suspect, however, that a pro doing design work on a system containing one of these (or perhaps a more upmarket workstation card) isn't likely to have much trouble affording an entirely separate rig nearby just for games.
 

CaedenV

Splendid
Interesting article, but kind of strange as well. These cards are not really made for doing design work so much as for managers and other non-technical people to view other people's projects for review. Still, I enjoyed reading the article and would love to see follow-ups on higher-end products. I would especially love to see gaming GPUs compared with their workstation cousins. I know many workstation cards use very similar hardware that is underclocked for stability, or fitted with ECC, and simply ship with a different driver, while other architectures in the past have been vastly different from the more civilian cards.

Still, if you are making any amount of money doing this kind of work, I am pretty sure you would spend a minimum of $250 on your card, and likely somewhere in the $500-1000 range, because it is the bottleneck of your productivity and the main factor determining how many projects a person can do in a year.

Lastly, I would love to see how this card scales on different hardware, to see how much was the $100 GPU versus how much was due to running a dual-CPU setup ;) Something tells me that most computers this particular card would go in are very small desktops with 4GB of RAM and a dual-core CPU.
 

fuzznarf

Distinguished
Sep 22, 2011
120
0
18,680
0
[citation][nom]caedenv[/nom]Lastly, I would love to see how this card scales on different hardware to see how much was the $100 GPU, vs how much was due to running a duel CPU setup Something tells me that most computers this particular card would go in are very small desktops with 4GB of ram and a duel core CPU.[/citation]

Not so much. When rendering and encoding, memory is the second-largest, sometimes the largest, portion of the budget. My current rendering rig sports 8×4 GB, with a Quadro 400 and an old dual-core Opteron. Memory is vital for things like realistic liquid rendering: high-quality renders eat large chunks of memory and will crash and fail without enough of it. Many times, before I upgraded to 32 GB, I would wake up in the morning after 8 hours of crunching only to find my render worthless. If I were to get a new rig, which I am currently considering (hence my initial request for Blender benchmarks), I would get at least 64 GB of memory, and preferably 128. The second-biggest item would be the video card. Some things are just impossible without huge memory, regardless of the processor.
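A rough back-of-the-envelope shows why liquid and smoke renders eat memory this way: a dense simulation grid stores several floating-point channels per cell, so memory grows with the cube of the resolution. A minimal sketch (the channel count and grid sizes here are illustrative assumptions, not any particular package's internals):

```python
def fluid_grid_bytes(resolution: int, channels: int = 6,
                     bytes_per_value: int = 4) -> int:
    """Approximate memory for a dense cubic fluid grid:
    resolution^3 cells x channels (e.g. velocity xyz, density,
    pressure, temperature) x 4 bytes per float32 value."""
    return resolution ** 3 * channels * bytes_per_value

# Doubling the resolution multiplies the footprint by 8.
for res in (128, 256, 512):
    mib = fluid_grid_bytes(res) / 2**20
    print(f"{res}^3 grid: ~{mib:.0f} MiB")
```

Under these assumptions a 512^3 grid already needs about 3 GiB for the grid alone, before geometry, textures, and the renderer's own buffers, which is consistent with 32 GB of system RAM being the difference between a finished overnight render and a crashed one.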
 
G

Guest

Guest
It would be nice to see in-app, real-world comparisons, and Cinebench 11.5 as well. I only say this because I've heard SPECviewperf makes certain OpenGL calls that aren't actually commonplace in most of these apps but that the pro drivers support; certain OpenGL calls, mind you, that the gaming cards definitely don't support and that slow things down considerably.

Maybe a frames-per-second test using a high-poly scene with high-res textures, an image plane, and an animated character. I know Maya will give you an FPS heads-up display; I'm not sure about some of the other apps. If it's Maya, maybe include some Paint Effects. That would better simulate real-world results.

It's just been my experience, testing gamer cards versus pro cards with real animation scenes, that the actual performance difference is much closer than SPECviewperf will lead you to believe. At least in Maya; I can't say for some of the other OpenGL apps.
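The frames-per-second test described above can be sketched generically: time many repetitions of a draw call and average. The draw function below is a stand-in CPU workload, not a real viewport redraw or Maya call:

```python
import time

def measure_fps(draw_frame, frames: int = 100) -> float:
    """Average frames per second over `frames` calls to draw_frame."""
    start = time.perf_counter()
    for _ in range(frames):
        draw_frame()
    elapsed = time.perf_counter() - start
    return frames / elapsed

# Stand-in workload in place of a real scene redraw.
def dummy_draw():
    sum(i * i for i in range(10_000))

print(f"{measure_fps(dummy_draw):.1f} fps")
```

Averaging over many frames is the same idea behind Maya's heads-up FPS display; the key for a fair gamer-versus-pro comparison is that the per-frame workload (scene, textures, shading mode) stays identical on both cards.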
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
2
[citation][nom]fuzznarf[/nom]Not so much. When rendering and encoding, memory is the 2nd, sometimes the largest, protion of the budget. My current rendering rig sports 8x4 Gb, with a Quadro 400 and an old Opteron dual core. Memory is vital for things like realistic liquid rendering because many times if you are doing high quality renders, it will eat large chunks of memory and crash and fail without enough memory. Many times, before I upgraded to 32g memory I would wake up in the morning after 8 hours of crunching only to find my render worthless. If i were to get a new rig, which I am currently considering hence my initial request for blender benchmarks, I would get at least 64Gb memory, and preferably 128. 2nd biggest item will be vid card. Some things are just impossible without huge memory, regardless of processor.[/citation]

My high school tried to cheap out on computers' RAM for Autodesk's 3D rendering. Needless to say, it did not work out well.
 

JohnA

Distinguished
Aug 20, 2010
84
0
18,640
3
I've put BF2 on my laptop (FX3700M) and run it maxed out on the machine's 1920×1200 display. It gave similar results to the Nvidia consumer card with the same GPU. You get about the same performance in games, but I've noticed the colors look... different. Not better or worse, just different. Often the picture seems saturated, or something similar. The difference is night and day in software that is optimized for Quadro cards: a $400 pro card beats any consumer card out there by a wide margin.
 
