AMD Announces the Affordable FirePro V4900

Guest
*sigh* It seems this always comes up with every workstation GFX article: if you don't know the difference between a workstation card and a normal consumer card, then you don't need one.

For those who actually do need one: the majority of the cost is in driver development. These cards have drivers certified to work with specific applications, and the guarantee means a specific version of the application, running a specific driver version, vetted against a certain hardware setup, will not crash. Even if your hardware setup doesn't conform completely, there is high confidence that operations will be crash-free. An integral part of this stability is the requirement that pro cards use specific approved hardware to maximize compatibility; the operational tolerances on these cards are much tighter than on your consumer cards.

The drivers written for these cards are a completely different beast from your standard gaming drivers. They concentrate on quality rather than speed, to ensure frame-buffer integrity and minimal graphics defects (which are hellishly annoying if you are manipulating objects), which is why their gaming performance is abysmal. Also, generally speaking, these cards can address a much larger frame buffer than your standard gaming cards.

@lockhrt999
CUDA is all well and good, but outside of academia it is rarely utilized. Accountants don't like the idea of investing time and money developing stuff that will only work on specific hardware; they would be more open to an open standard, but even then it's a hard sell. And frankly speaking, Eyefinity trumps CUDA any day of the week for your standard workstation operator (if there is such a person).

@wiyosaya
Actually, I would agree with you completely, but then again you have to understand that SolidWorks (like Inventor and its peers) is an entry-level CAD system, targeted at small workshops with modest budgets and computing hardware to match. I would not expect to come across a fully relational, parametrized 100-component sub-assembly in such instances. CATIA is its older brother (and it's almost essential to have a pro card with that beast), and if you're just running simple models in CATIA then you obviously have not correctly evaluated your needs.
 

halcyon

[citation][nom]11796pcs[/nom]Just looked it up myself- here's the answer:http://www.ehow.com/about_5426830_ [...] -card.html[/citation]
Informative, thanks.

[citation][nom]Kensingtron[/nom]Thanks for pointing the typo out though.[/citation]
Look, you made a mistake. It can't be tolerated because the folks here, myself excluded, are all perfect. Hold your wrist out...they'll need to slit it.


So: this is pretty much the same as this one... not quite, though many of the posts above (and the one below) seem to suggest they are, except for FW and SW.
 



Still can. Well, not the BIOS, but you can manipulate which drivers get used. RivaTuner has the ability to override the hardware ID that is reported to the system, and thus which drivers get installed. I used this program to run GeForce 6800 drivers on a Quadro FX card; they were the same silicon, just a different driver package. Worked like a charm, and the gaming benchmarks were higher, most likely via performance tunes and shortcuts that wouldn't be tolerated in a CAD/CAM/rendering program. There should be a similar tool for ATI/AMD-brand cards.

DACSystems,

You're partially correct on the drivers issue. Both NVIDIA and ATI have moved to a UDA (Unified Driver Architecture) model. There is only a single set of kernel drivers, but these drivers have various options and settings that are enabled or disabled on a per-device basis. Professional cards and gaming cards are identical hardware-wise; their hardware ID is what tells the OS which driver options to enable or disable. You can fake out that hardware ID and play with those options yourself. Many of them will make your card run slower but yield more accurate results; others will make your card run faster in certain scenarios (games, etc.) at the expense of image quality, typically movement and lighting during fast-changing scenes. Trying to run Maya or other professional development programs with a card using gaming settings can create instabilities and artifacts, just like trying to run a game with professional options turned on will make the game run slow and choppy.
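
Side note for anyone who wants to see this for themselves: here's a minimal sketch (assuming a Linux box with Python 3 and the standard sysfs layout; the paths are generic PCI, not AMD- or NVIDIA-specific) that lists the vendor/device IDs a unified driver keys its per-device options off. Tools like RivaTuner work by overriding the device ID the card reports here.

[code]
# Minimal sketch: list PCI display adapters and the vendor/device IDs
# a unified driver inspects to decide which options to enable.
# Assumes Linux sysfs; works as an ordinary user.
from pathlib import Path

PCI_ROOT = Path("/sys/bus/pci/devices")

for dev in sorted(PCI_ROOT.iterdir()):
    # PCI class 0x03xxxx = display controller
    pci_class = (dev / "class").read_text().strip()
    if not pci_class.startswith("0x03"):
        continue
    vendor = (dev / "vendor").read_text().strip()  # e.g. 0x1002 (ATI/AMD), 0x10de (NVIDIA)
    device = (dev / "device").read_text().strip()  # the per-model HW ID the driver keys off
    print(f"{dev.name}: vendor={vendor} device={device}")
[/code]

If the above is right, that device ID is essentially the only difference the driver sees between a pro card and its gaming twin on the same silicon.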
 

halcyon

[citation][nom]eddieroolz[/nom]Wait, isn't the price missing a whole place value? A new professional card at only $189!?[/citation]
Oh, don't worry, I'm sure those who buy it are getting what they paid for. /facetious
 

lordstormdragon

Clown shoes. AMD and Nvidia are both blatant frauds with their "professional" lines. Anyone who can do math and has used these applications, especially Maya, should already know this.

Hey, you wouldn't buy an Apple for graphics production, would you? Oh, you moron. You did buy one.
 

halcyon

This also makes me better understand why Apple "only" uses mainstream GPUs in its Mac Pro line. They know that's all they need...though I would like to see them keep better pace with the higher-performing GPUs, even if only as an option. I can imagine an HD 6990 would be quite tasty in a 6-core Xeon'd Mac Pro.
 

amk-aka-Phantom



Maybe that explains why GPU upgrades there cost so much: perhaps AMD lets Apple use special drivers for the 5870 to make it work like a pro card? :lol:
 

halcyon


Well, I think it's wishful thinking...but I like it.
 