Nvidia Quadro FX 4800: Workstation Graphics At Its Finest?

I would love to see the performance numbers of this card modded as a GeForce and overclocked.

I have used GeForces modded as Quadros in AutoCAD. Nvidia should include good OpenGL/Quadro drivers in its GeForce driver package and charge an optional premium for support. But there is no reason to buy a really expensive GeForce and have it deeply underutilized, with the excuse of "driver support" that I never used.

The real reason for the Quadro premium price is that ATI and Nvidia have an oligopoly. If they had more competition, they would charge a single, lower price. And the GTX 280 is already expensive as a GeForce.
 
"Eh, it would have been more interesting if it was softmodded. I know the GT200s don't take well to being softmodded (apparently it is possible), but I doubt anyone thought the performance would be the same unmodified."

I'm with you on this one. Also, why did they compare against a GTX 280 when you know it's faster than the GTX 260 in gaming? That's plain stupid.


Also, for people who don't know: the ring bus (ATI) is amazing at complex calculations, while Nvidia, with all those little pipelines, works very fast on small tasks. Hence, for Folding and the ViewPerf benchmark, smaller is better for Nvidia and larger is faster for ATI. If you are making very complex muscle, fluid, or CAD models, use ATI Pro cards (amazing, might I add, because a $150 V3570 outperforms a $650 Quadro; best choice I made). Nvidia is for small models under 5 million polygons, or lots and lots of small things.

What I want to see is the whole FirePro lineup against the new Quadros based on the GT200 core. It's simple, really: ATI wins Maya and SolidWorks, while Nvidia ties in the rest.
 
I suppose which workstation card one should get is dependent on the tools they use. Of course since I never really use anything more than blender for some simple 3d flash objects, I probably won't have use for these cards anytime soon ^_^.
 
I think many miss the point of the workstation cards: these are for specialized situations where the cost of the hardware is trivial compared to the cost of the user's time. Say you're a 3D modeling/rendering pro charging clients a couple of hundred an hour to do real work. If the workstation card speeds up your work by 20 percent, it only takes about a week for it to pay for itself, so from a financial standpoint it's a good deal. On the other hand, if you have glitches with a consumer card, or it drops a few lines here and there in a drawing, the cost of the person's time to either struggle with the hardware or to fix the actual problem far exceeds the cost savings of the consumer card.
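The payback arithmetic above can be sketched as a quick back-of-envelope calculation. The specific figures here (a $2,000 card premium, a $200/hour rate, a 40-hour billable week) are illustrative assumptions, not numbers from the article:

```python
# Rough payback estimate for a workstation card, using assumed figures.
card_price = 2000.0      # USD premium for the workstation card (assumed)
rate = 200.0             # USD per billable hour (assumed)
hours_per_week = 40.0    # billable hours per week (assumed)
speedup = 0.20           # workflow runs 20% faster

# A 20% speedup means the same work fits in 1/(1 + speedup) of the time,
# so the hours freed up each week are:
hours_saved = hours_per_week - hours_per_week / (1 + speedup)

# Value of that freed-up time, and weeks until the card pays for itself:
value_per_week = hours_saved * rate
weeks_to_payback = card_price / value_per_week

print(f"hours saved per week: {hours_saved:.2f}")     # ~6.67 hours
print(f"value per week:       ${value_per_week:.0f}") # ~$1333
print(f"payback period:       {weeks_to_payback:.1f} weeks")  # 1.5 weeks
```

With these assumptions the card pays for itself in about a week and a half, which matches the order of magnitude the poster describes; plug in your own rate and card price to check your situation.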

There are only a few CUDA applications aimed at consumers, but for people developing specialized things in-house, it can be a lifesaver. And as much as ATI/AMD's propaganda department wants you to believe they have a stake in that market, they don't. No one is using their tools right now, because you have to learn ATI-specific hardware assembler to do similar things. As for OpenCL: remember that it takes years to build an installed base, and all of the people learning CUDA (myself included) aren't going to start over in the summer. It will take two or three generations of compilers and drivers to get OpenCL to a stable point, as is the case with all software. CUDA owns the market right now and will for the foreseeable future, because the hardware and, more importantly, the installed user base already exist and are therefore years ahead of competing technologies.

Finally, I think the Tom's reviewers are making a mistake by not reviewing the Quadro FX 5800. That card really is aimed at the top of the market, and calling an article "Workstation Graphics at Its Finest" based on Nvidia's second-place card is kind of silly. From a processing perspective, the performance difference between the FX 4800 and FX 5800 should be similar to that between the GTX 260 and GTX 280. But there is another enormous difference: the FX 5800 has 4GB of DRAM, so you can physically handle far more data than is possible on the FX 4800. For that reason alone, the FX 5800 may be more than worth the extra cost, because you can do things with that card that are impossible with the FX 4800. A better review would have pointed this out.
 
Lol, the sad thing is one can easily softmod a gaming card for MUCH less and have the same performance. The only difference is the amount of onboard RAM. Unless you absolutely need more than 1GB of onboard RAM for rendering, you can reduce cost a lot by softmodding. The fact is, most professional workstation users are quite reluctant to do a softmod.
 
[citation][nom]MagicPants[/nom]It would be nice if Nvidia would just sell the workstation driver separate from the hardware seems like that would make life much simpler.[/citation]
Then there would be pirated copies of the drivers. More people would be using them. At least now it's not as easy as installing a program. And the GT200 series of cards have made mods very difficult.
 
3-way SLI for PNY QuadroFX 5800, which is essentially the equivalent of a supercomputer, runs Crysis VERY VERY VERY FAST.
 
Yep, nice to see GTX 280 compared. I always assumed workstation cards to be just blatantly overpriced with maybe 25% or so better performance through drivers. Apparently not so.
 
[citation][nom]bob49574[/nom]Why do I feel like when everyone compares workstation cards to gaming ones they get it wrong? An FX 4800 will perform 99% like a GTX 260, and if you softmod the latter into a Quadro then you have the same effect the other way around. Really, you are paying for driver support. I'd much rather just pay for the card.[/citation]

Up until the GeForce 6800 series, you could softmod one into a Quadro and get 99% of a real Quadro in workstation (OpenGL) apps. But ever since, Nvidia has locked the cards out physically. You can still softmod one, but you're still FAR from a real Quadro (we're talking an improvement from 30 to 50 fps in Maya, versus 300+ fps on the real thing).

As of right now, the latest card that can be softmodded is a Radeon 3870 (into a FireGL 8600, I think). If you want a (relatively) cheap gaming card that can handle light workstation duty, get a Radeon. A 4850/70/90 readily beats its GTX counterparts in OpenGL. Actually, an old 7900 GT can beat a GTX 280 in workstation apps!
 
You cannot soft-Quadro any Nvidia card after the 6800. Yes, it may come up as a Quadro in Windows, but it will NOT have the unified back buffer enabled. If you doubt this, go to the RivaTuner forum on Guru3D.
 
Workstation cards were born in the days when OpenGL was the only serious API for 3D applications; nowadays DirectX has matured to the point where it provides the same and often better performance. Gaming cards fully support DirectX. Of course, Nvidia and ATI would have you believe you need to spend big bucks on a workstation card....

The comparison to the GeForce card is a joke: it was carried out using the ViewPerf 10 benchmark, which is an archaic OpenGL test.

The majority of the applications benched in the review will work fine (in many cases better) using DirectX, and in that scenario there are few tangible benefits to using a Quadro over a GeForce.

There are still some applications that rely on OpenGL as their sole graphics API, and in those cases a workstation card is required. These are mostly high-end CAD/PLM products whose lifecycle is much longer than that of the mid-range 3D software market, and hence their technology implementation lags somewhat behind as a result.

The reviewer needs to do some real-world research rather than relying on benchmarks that do not reflect the actual implementations of hardware and software today. To dismiss gaming cards as no good for workstation apps is plain wrong and misleading.
 
^moward is correct regarding the suitability of DirectX today. Let's hope software heads there in the future. On a different topic, it is important to remember that for 3D CAD apps like SolidWorks, it is not only speed that plays into the choice of card, but stability. Many gaming cards will run these applications, but they are rarely stable and free from graphics issues. If you call SolidWorks tech support about anything involving stability, the first question is "What video card are you using?"
 
Re DirectX: does that mean you can select DirectX over OpenGL in a program like 3ds Max?

I'm a Photoshop guy who wants to get into 3D modeling (stills, not moving) and would rather buy a GeForce than a Quadro due to $$$$.

Reading this article made me think I might need a Quadro, but if there is any way to use the gaming cards, I will :)
 
[citation][nom]whitewhale1[/nom]im a photoshop guy that wants to get into 3d modelling (stills not moving) and would rather buy a geforce than a quatro due to $$$$reading this article made me think i might need a quatro but if there is anyway to use the gaming cards i will[/citation]
Whoa, slow down there. You don't need a workstation card to get decent rendering performance while you're learning. You might want to wait until your skills improve and you start modeling (or animating) complex scenes. If you buy something now, by the time you actually need that power it will be half the price.

I have an old Pentium D machine that did great work when I was learning Blender, even when I used ray tracing.
 
That's what I was thinking; I was just worried I wouldn't even be able to run 3D programs like Maya properly. These test results made me think I would crash and burn trying to run them.

Photoshop obviously doesn't require a kicka$$ video card, so I'm going with the cheap build.

thanks mate :)
 
I use Blender myself, and it's free and open source. There is quite a bit of documentation on it, and it offers features in line with other fully featured suites.

http://www.blender.org/

Although it really depends on what you want to do. If you're going into game mods, 3ds MAX would probably be the choice to be widely accessible. But, if you just plan to make stills and animation, Blender works great.
 
Why are they faster? Are the workstation applications written especially to use the GPU on those cards? If so, why couldn't they use a gaming graphics card's GPU? What role does the driver play in this?
 
[citation][nom]bboysil[/nom]Why are they faster? Are the workstation applications written especially to use the GPU on those cards? If so, why couldn't they use a gaming graphics card's GPU? What role does the driver play in this?[/citation]
The drivers are application-specific. Millions of dollars go into development for specific applications. The GPUs are the same, but Nvidia attempts to lock down the hardware.

That is what was being discussed. Softmodding, using software to modify the intended behaviour of hardware, allows one to "convert" their card. However, recent measures have made this difficult.
 
Good performance, but I would like to ask one thing about the mobile Quadro FX 3600.

I have an HP laptop (8710w) with this card and it just doesn't perform as it should. In ViewPerf 10 I get roughly the same results as the GTX 280 in the article, while it should score roughly three times higher according to other tested workstation laptops (HP and Dell models with the same FX 3600).

Yes, I get 8600 points in 3dmark06 and all, but I bought that laptop to work with 3dsmax and it just doesn't deliver.

Can someone tell me where the problem could be? I am using the same drivers as in the article (182.08 for Quadro cards).

Sorry for the off topic, but it seems to be the right place to ask.
 