Pro Graphics: Seven Cards Compared


wiyosaya

Distinguished
Apr 12, 2006
915
1
18,990
[citation][nom]sma8[/nom][citation]Not true. I can flash the BIOS on my 8800 GTX and it will run just like its workstation cousin. They are using the same hardware but handicapping the consumer card.[/citation]BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because the BIOS flash alone doesn't make it a Quadro. The only way to get it to work as a Quadro is by using RivaTuner. And once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.[/citation]
Your point?

More to the point, it is the same silicon. Using some other program to tweak it after flashing the BIOS does not change the silicon. ATI and NVIDIA are using good old-fashioned capitalist means to make the most money they can from their silicon. They sell the same silicon, with minor alterations to the BIOS and other software tweaks, for five times the price to the professional market, because they figure that market will pay for a performance edge that may not be worth it except in the most complicated of 3D models.

Though reviewers insist that a comparison of gamers' cards to "pro" cards is not valid, since it is the same silicon it would be very interesting to see whether professional cards are really worth five times the price.

Given reviewers' reluctance to do this, in general and especially at Tom's, I really have to wonder whether Tom's is under contract NOT to compare them to gamers' cards and blow the five-times-the-price argument out of the water.
 

zajooo

Distinguished
Aug 13, 2008
3
0
18,510
[citation][nom]eodeo[/nom]Which brings me to the ultra high end of the DCC world: ATI's latest 48x0 cards. These have 800 unified shaders. They are just wiping the floor with all the cards mentioned in your article put together(!). All of them together (if that were possible) don't have enough power to compete with even a single new card from ATI.[/citation]

No doubt at all.

[citation][nom]eodeo[/nom]Quote: I wish I would have saved my $3,500 in Quadro cards and purchased more 8800 GTs or even an ATI 4870 X2.


I was trying to cut down on words and still get the main message across. Thus, I failed to mention that no "professional" application can use more than a single GPU. This means no CrossFire/SLI support and also no X2 card support. It doesn't mean that you can't use these setups with them, just that they won't utilize the extra GPUs. So only games are going to use all the extra power, and in truth, only games need it.[/citation]

Thanks, that's another new fact for me ...

I have done some tests in the future http://www.cgarchitect.com/vb/22068-scene-optimalization-viewing-rendering.html?=#post152361
which later led me to write three articles on different aspects of optimization, but they are only in Slovak ... I will try to find at least some images from the testing and post a link.

eodeo, keep informing people ;-)
 

tai_anjing_lu

Distinguished
Aug 15, 2008
4
0
18,510
[citation][nom]eodeo[/nom]First off, I have to say that this has been a very good review. Naturally, I have several things I'd like to complain about. I mean, as readers, it's our god-given right to complain and never be satisfied. Right?

The good: you exposed and explained each GPU very nicely, noted each one's gaming counterpart, included specifications for each card and commented on them in detail. So if everything is so well and good, why do I complain? Simple: the tests. You're using the SPECheatTest. It's well known that this test is optimized to show that even the crippled "workstation" cards outperform the "gaming" cards with much superior hardware. The fact is, at least 3 programs you tested here today don't reflect the "bell" conditions posed by the SPECheatTest. Well, actually it's 2: you haven't tested AutoCAD, and I can't really comment on the other applications, as I am not familiar with them. The 2 applications I am familiar with and fully competent to speak about are 3ds Max and Maya.

What do these 2 have in common, other than being under the same roof now? DirectX support. ANY application supporting DX is not being crippled in the drivers when running it. You will see huge leads for ATI cards over NVIDIA here. But I'm getting ahead of myself. In all the tests you failed to mention that OpenGL is horridly slow, even on these "professional" cards. This is, of course, in the cases where you can choose between OGL and DX in the same application. Not only is it slow, but it's visually incomplete, as it lacks functions for the quality display of lighting and shadow conditions that only DX 9.0c can display. I'm willing to forgive you the last one, as you maybe only tested the thing and didn't check for visual quality differences. That's not to say that the SPECheatTest you used can display these real-life conditions.

So with these 2 things in mind, it's easy to see that only idiots or people unable to use DX would use OGL instead. To be fair, you did say that NVIDIA's 8800 GTX... erm, I'm sorry, I meant Quadro FX 5600, is the best OGL card, and I agree. NVIDIA uses archaic logic in its hardware and OGL fits that pattern perfectly, so it's no surprise that NVIDIA should win. If you have to use OpenGL for any reason, NVIDIA is your man... erm, company. If, however, you're not stuck on the appalling Mac platform or with archaic software that doesn't support DX, it should be mentioned that ATI has a significant lead. Not surprising either, since it has 3x more shaders at the same price. Games can't use these well most of the time, but that is not the case with digital content creation (DCC) programs like 3ds Max, Maya or AutoCAD.

Speaking of games: you did mention that the 8800 GTX = FX 5600 is 2 generations old. You failed to mention that the latest GTX 280 has 240 unified processors, and you failed to add it to the tests. Not that I think the SPECheatTest would show that it's 2x faster, but the fact of the matter is that it is, for any of the 3 above-mentioned programs, and likely all others that support DX (possibly OGL too, but I'm not sure how crippled it is in the drivers; more on this later). Which brings me to the ultra high end of the DCC world: ATI's latest 48x0 cards. These have 800 unified shaders. They are just wiping the floor with all the cards mentioned in your article put together(!). All of them together (if that were possible) don't have enough power to compete with even a single new card from ATI. You conclude that ATI is the best deal at $1,000, but you fail to notice and differentiate between outdated OGL programs and new D3D ones.

So the ATI card is the absolute, undisputed winner for current DCC. The crown can, by no stretch of the imagination, go to NVIDIA, unless you mention that it's exclusively for outdated OGL programs, in which case it does get it. Also, I'd again like to mention that the fact that you're using the SPECheatTest isn't helping you build your case either. And in addition to all this, you also failed to mention that you can have a 10x more powerful card for less than $300: the HD 4870. The difference is solely in drivers, and not even every part of the drivers, just the OGL implementation.

Which finally brings me to drivers. All the professional cards are 99% the same as their gaming equivalents; they differ only in drivers. You said so, and I agree. What you failed to mention is that the professional cards are actually noticeably slower than their gaming equivalents. For stability, they say. I challenge anyone to prove that gaming cards are less stable. There are 2 reasons for card instability: 1) inappropriate cooling or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) be more stable. The problem with this nice theory is that NO DCC program available today can stress any of the cards mentioned here beyond their framebuffer capacity. Chances are that your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can utilize only a single CPU core.

Which again brings me back to drivers, as the only other cause of instability. Here is another interesting fact you might not have known: the people writing drivers for these 99% identical cards don't do it twice. The process may vary, but in a nutshell it's like this: they write the drivers for the gaming card. At this point the work splits into 2 different software paths, games or DCC software. The DCC path gets more testing on the DCC software, and in 99% of cases it's done there and published; it will be thoroughly tested to confirm that there aren't any major bugs and then shipped. The games driver path does not end there. The driver programmer has one more duty: to cripple performance under what he deems to be "professional" software(!!!). So to reiterate: the driver programmer, instead of perfecting the drivers, actually sits down and starts writing code to CRIPPLE(!?!) the gaming line of cards. One would imagine that he could spend his time employing his talents elsewhere. So, crippled or not, the drivers are 99% the same. If instability afflicts one line of the cards, the other isn't spared by "superior" drivers. So, in reality, the workstation cards are no more stable than their gaming siblings, even if many would like you to believe they are.

In conclusion, I'd also like to nitpick the fact that you use very low resolutions for testing, capping at 1600x1200. As you might have guessed, anyone interested in working in 3D will start at that resolution as a minimum, not end at it. This is not a serious oversight, as you have been using the SPECheatTest to test everything, so your results should be taken with a grain of salt anyway. Maxtreme was once useful, some 8 or so years ago. Ever since DX entered DCC programs, OGL, Maxtreme and the likes of them have been dying. The last couple of iterations of Maxtreme were nothing more than DirectX with a fancy name to make Quadro buyers happy and make them feel special. It held no visual or speed advantages whatsoever.
Then again, it introduced no ill effects either. Honestly, I haven't tried R11, but I seriously doubt it brings anything new (since that's not technically possible). And as for OGL support in it, that just goes to show how much they know about 3ds Max. With OGL you cannot enable viewport shadows or any of the advanced viewport lighting techniques possible only in D3D. So, as I said before, OGL is not only seriously slower (and I mean seriously), it's also lacking much in the visual quality department. I've said it before and I'll say it again: only idiots or people unable to use DX will opt for OGL. And when you say "slowly" you sound like it's 2001 all over again. Newsflash: it's 2008, and DX has been a de facto standard for about 7 years now. True, not all DCC software reflects this, like AutoCAD, which got its first DX support with the launch of Vista. But let's not kid ourselves here: that software is really only a crippled version of the actual DCC leaders like Max, Maya, XSI, Lightwave... And let's be serious for a moment: that software works beautifully on 4-generations-old hardware because of its inherent purpose.

Thank you for reading. That was short... maybe I should have published it.[/citation]
Nice bullshit talking, you skinny punk, as if you are a computer god. You obviously know nothing about computer hardware at all. You're just a computer idiot. My ATI Rage will blow away your shitty pro card.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
I have done some tests in the future

That would have been cool, if it weren't a typo ;)

sorry, I meant in the past :-D

Yeah….. I know… I wish it was in the future :D

which later led me to write three articles on different aspects of optimization, but they are only in Slovak ... I will try to find at least some images from the testing and post a link.

I’m from Serbia, so we’re kinda neighbors :)

eodeo, keep informing people ;-)

I plan on doing so.

Nice bullshit talking, you skinny punk, as if you are a computer god.

Damn! My cover is blown. I’ll have to use a different nick now, to hide my true identity. Maybe MasrerOfAllThingsElectrical will confuse more people. What do you think?

You obviously know nothing about computer hardware at all. You're just a computer idiot. My ATI Rage will blow away your shitty pro card.

Owww... Do I sense an angry Mac user, hating MS so much that he’s actually afraid to accept that his platform is even more decayed than he initially feared? Owww...

If you don't trust me, feel free to download trial versions of any of the programs I mentioned. They're free for 30 days and you'll have plenty of time to verify my claims. Oh, but you can't test them on consoles or Macs; you'll need to use a grown-up computer for this: a PC... with MS Windows on it.

But don't worry; you can use your 2x more expensive PC and the very handy Boot Camp it apparently comes with to install MS Windows on it, so it will be a less bad 2x more expensive PC. Hooray!

Another job well done.

Did you know that I grew up in Europe, where the history comes from?
 
Guest
eodeo, have you ever considered applying for a job at Tom's? Your first comment was fantastic. Very useful info ;)
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
Your first comment was fantastic. Very useful info

Thank you :) I did try to make it short and sweet. I seem to be writing a lot lately. I seem to enjoy it too, which is weird. I would never have considered your question seriously just a few months ago.

eodeo, have you ever considered applying for a job at Tom's?

Actually, it did occur to me, but I don't know how. Also, as you might have read in the post above, English is not my native language. Still, I never write anything in Serbian, and lately it seems I've been writing a lot in English :)
I find it strange, but it seems I have a lot to say, and I think there are many who could be interested in hearing it. Also, I've been an avid THG reader for a long time now. It does seem like a good combination.
 

Oded Erell

Distinguished
Aug 27, 2008
1
0
18,510
Hey guys,

The Quadro FX 3700 is missing from the benchmarks,
and as I understand it, it should really be the card compared with the FireGL V7700.

It's a little cheaper, but I wonder if it's also weaker...
 
Guest
Hello guys, any idea about a new test of AGP cards with a dual-core CPU?

Thanks!
 
Guest
Hi,
Hey, does anyone know how dual cards stack up? It's not as simple as saying that since the Quadro FX 1700 scores 3.11 in the SPECapc 3ds Max test, dual 1700s would score 6.22, is it?

I'm asking because I recently bought a refurbed workstation that already had dual 1700s in it (it was before I saw this article, and the refurb was a good price, but I had to take it as-is). So I was wondering if I should pull those out and sell them to subsidize a single, more powerful card...?

I didn't think it mattered whether the software vendors (Avid/Autodesk etc.) write code for dual cards... I was always under the assumption that the graphics load was sent out to the OS, and the OS would distribute the workload across however many GPUs were available to do the work...?

Anyway, even if I stick with the dual 1700s it'll still be a massive upgrade from my (very) old workstation with the Radeon X800 Pro on an AGP bus! :)
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
Hello guys, any idea about a new test of AGP cards with a dual-core CPU?

Don't hold your breath on AGP. PCIe 1.0 is slowly being phased out now; AGP went through that in 2006, and seeing how it never returned, I don't expect it to be phased out again.

Hey, does anyone know how dual cards stack up?

I was trying to cut down on words and still get the main message across. Thus, I failed to mention that no "professional" application can use more than a single GPU. This means no CrossFire/SLI support and also no X2 card support. It doesn't mean that you can't use these setups with them, just that they won't utilize the extra GPUs. So only games are going to use all the extra power, and in truth, only games need it.

I realize I write a lot, but most of it is useful... I think.

BTW: Quadro FX 1700 = GeForce 8600 GTS.
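
To make the single-GPU point concrete, here is a minimal, purely illustrative Python sketch. The 3.11 figure is the single-card SPECapc 3ds Max score quoted in the question above; the "one GPU only" scaling model is just the claim being illustrated, not a measurement.

[code]
# Illustrative sketch only: models the claim that "professional" viewports
# drive a single GPU, so adding cards does not multiply the benchmark score.

def viewport_score(single_card_score: float, num_gpus: int) -> float:
    """DCC viewport apps (3ds Max, Maya, ...) use only one GPU."""
    return single_card_score  # extra GPUs sit idle

def naive_expectation(single_card_score: float, num_gpus: int) -> float:
    """What you might hope for: perfect linear scaling."""
    return single_card_score * num_gpus

if __name__ == "__main__":
    score_fx1700 = 3.11  # single Quadro FX 1700, from the question above
    for gpus in (1, 2):
        print(f"{gpus} GPU(s): hoped-for {naive_expectation(score_fx1700, gpus):.2f}, "
              f"realistic viewport score ~{viewport_score(score_fx1700, gpus):.2f}")
[/code]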
 
Guest
Can I produce a Pixar, Blizzard or Blur quality image with a GeForce card, without paying the steep price of a Quadro? Do you guys know?
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
Can I produce a Pixar, Blizzard or Blur quality image with a GeForce card, without paying the steep price of a Quadro? Do you guys know?

A Quadro produces exactly the same picture quality, since it is exactly the same card as its GeForce equivalent, just with a new name and a bigger price. That said, the GPU was never involved in final rendering: cinematic quality depends on the renderer and the CPU, not even 1% on the GPU. The GPU is only there for previewing purposes in the viewports. Having a better GPU will only let you see your scene faster in real time. That is all.

Now, with the acquisition of mental images, NVIDIA will try to implement the mental ray renderer in CUDA, and that should allow (relevant) GPU rendering for the first time. When done, it would make rendering much faster (since a GPU is, in general, on the order of 100x faster than a CPU at this kind of work), and it would make your GPU do something, if not most of the work, while rendering a final-quality image.
 
Guest
Hey eodeo!
Nice things you write here, but there are many real PRO applications that use OGL exclusively (because they are multi-platform: Linux/Mac/Win). To name just a few: Maya, Nuke, Fusion, the Autodesk line (Toxik, Lustre, Inferno etc.) and even Adobe After Effects. So a GeForce is probably better in 3ds Max because Max is Windows-only and its viewports are (now) based on D3D, but for real OGL apps, if you want performance, there is no way other than FireGL/Quadro! Do you agree?
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
Maya has superb DX support. It's only on the Mac that it lacks both of the features that mark a modern 3D app: D3D and 64-bit.

there is no way other than FireGL/Quadro! Do you agree?
Well... if you don't want to softmod your card into its Quadro/FireGL equivalent, then yes, I agree.
 
Guest
It has DX support? How do you enable that for the viewports?!?!?
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
It has DX support? How do you enable that for the viewports?!?!?

It's on by default. You actually have to go and manually set it to OpenGL in the settings if you want to. It has been like this for quite some time now.

Naturally, this is only on Windows, the only place where Maya 64-bit/D3D exists. Other platforms are not as lucky.

You have to work under Windows/D3D if you want real-time lighting/shadow effects. Win/D3D is the new de facto standard, much like OpenGL was in the old days.
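
As a quick way to check from inside Maya whether you are on the platform where the 64-bit/D3D combination exists, here is a minimal Python sketch using Maya's built-in about command. The flags used (os, is64, ntOS, version) are standard, but exact return strings can vary by Maya version, so treat the printed values as informational rather than definitive.

[code]
# Minimal sketch: run inside Maya's Script Editor (Python tab).
# maya.cmds.about is a standard Maya command; return strings such as the
# value of about(os=True) can differ between Maya versions.
import maya.cmds as cmds

def report_platform():
    os_name = cmds.about(os=True)            # e.g. "nt", "win64", "mac", "linux"
    is_64 = bool(cmds.about(is64=True))      # True when this Maya build is 64-bit
    on_windows = bool(cmds.about(ntOS=True)) # True on Windows
    version = cmds.about(version=True)

    print("Maya %s on %s, 64-bit build: %s" % (version, os_name, is_64))
    if not on_windows:
        print("No Direct3D viewport on this platform; Maya falls back to OpenGL.")

report_platform()
[/code]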
 

xevious75

Distinguished
Sep 19, 2008
1
0
18,510
Hi all,
This is my first post here, so please bear with me!
I went through the whole thread and still have some questions for the big experts here ;-)
1. So OpenGL 2.1 is now trash... what about the new OpenGL 3.0? Is it fast/complete enough to be comparable with D3D?
2. Most of the coffee I currently drink standing in front of my PC gets prepared while 3ds Max renders... (notebook with a 1.83 GHz dual-core CPU and a GeForce Go 7300; 1 GB of RAM, enough not to swap). I'm now planning to reduce coffee consumption by buying a new workstation. I have 1000-1500€ to spend on it (that's not much, I know) and I wonder whether I should go for a fast GPU or CPU. Also, will 3ds Max take more advantage of a fast dual-core or a mainstream quad?
3. OK: technically, pro cards are nothing but theft. Now, if I buy a mainstream card (say a 9800 GT/GTX or a 3870/4850), will it run better than a same-priced Quadro FX 570 or FireGL V3600 in 3ds Max? If not, will I be able to "mod" it? I've heard of modding, but never tried...
4. Am I going to take advantage of CUDA if I buy an NVIDIA card now? Consider that I have to render and I use 3ds Max.
5. Vista is horrible. Can anyone kill Bill for me please? :-D Seriously: with the new PC I'll get a fancy Windows Vista to use. Should I ask for the 64-bit version? Struggle to downgrade the PC to XP?

Thanx a lot!!!
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
1. So OpenGL 2.1 is now trash... what about the new OpenGL 3.0? Is it fast/complete enough to be comparable with D3D?

http://www.tomshardware.com/reviews/opengl-directx,2019.html

2. Most of the coffee I currently drink standing in front of my PC gets prepared while 3ds Max renders... (notebook with a 1.83 GHz dual-core CPU and a GeForce Go 7300; 1 GB of RAM, enough not to swap). I'm now planning to reduce coffee consumption by buying a new workstation. I have 1000-1500€ to spend on it (that's not much, I know) and I wonder whether I should go for a fast GPU or CPU. Also, will 3ds Max take more advantage of a fast dual-core or a mainstream quad?
A quad is much better for Max.

1500€ is much more than you need for a complete(!) super rendering computer. To see the exact configuration I recommend, please see my post here:
http://area.autodesk.com/index.php/forums/viewthread/13386/#70544
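
For a rough sense of why the quad wins for rendering (a CPU-bound, highly parallel task, unlike the mostly single-threaded viewport), here is a small illustrative Python estimate. The clock speeds below are hypothetical example inputs, not recommendations, and the linear cores-times-clock scaling is an optimistic assumption.

[code]
# Back-of-the-envelope sketch: offline rendering scales (roughly) with
# cores x clock, unlike the mostly single-threaded viewport.
# The candidate CPUs are hypothetical examples; linear scaling is an
# optimistic assumption, but bucket renderers come reasonably close.

def relative_render_throughput(cores: int, ghz: float) -> float:
    return cores * ghz

baseline = relative_render_throughput(2, 1.83)  # the notebook from the question

candidates = {
    "fast dual-core (2 x 3.16 GHz, example)": (2, 3.16),
    "mainstream quad (4 x 2.40 GHz, example)": (4, 2.40),
}

for name, (cores, ghz) in candidates.items():
    speedup = relative_render_throughput(cores, ghz) / baseline
    print(f"{name}: ~{speedup:.1f}x the rendering throughput of the notebook")
[/code]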

3. OK: technically, pro cards are nothing but theft. Now, if I buy a mainstream card (say a 9800 GT/GTX or a 3870/4850), will it run better than a same-priced Quadro FX 570 or FireGL V3600 in 3ds Max? If not, will I be able to "mod" it? I've heard of modding, but never tried...
If you plan on using 3ds Max, Maya, AutoCAD or many other 3D programs that support Direct3D, gaming cards are faster by far. Their workstation equivalents can't catch up anymore (in general).

4. Am I going to take advantage of CUDA if I buy an NVIDIA card now? Consider that I have to render and I use 3ds Max.
No. CUDA is a buzzword for now. If you're using mental ray, there is a good chance that your NVIDIA card will let you render MR on the GPU one day. Till then, ATI has better value.

5. Vista is horrible. Can anyone kill Bill for me please? :-D Seriously: with the new PC I'll get a fancy Windows Vista to use. Should I ask for the 64-bit version? Struggle to downgrade the PC to XP?
Vista is horrible for Max. Not because Vista is bad, but because Max is written for XP and has a very poor port to DX10/Vista. When using DX10-capable hardware under Vista, Max's 3D performance is 98% slower than it's supposed to be.

You should "struggle" to upgrade to Windows XP x64 instead.

Good luck
 

silkey

Distinguished
Oct 6, 2008
3
0
18,510
OK folks.

I have just bought 2 flagship Dell workstations (T7400s with dual quad-core Xeons running Vista 64) with SLI; between them we have 2 8800 GTs and 2 Quadro FX 3500s. We tossed a coin to see who would win the 8800 GTs and I lost, so I ended up with the Quadro FXs. The main applications we use are 3ds Max and UT3, and in previous tests we found the cards to be comparable in 3ds Max but not in UT3 (the 8800s win on gaming hands down). But with the NVIDIA 169.96 driver, my machine running 2 x 3500s flies within 3ds Max (testing high-geometry scenes) whilst my colleague's machine is lagging terribly running the 2 8800s. We have spent all day trying to work out what to do; a possible option seems to be to sell the 8800s and buy more Quadros. Who knows?
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
whilst my colleague's machine is lagging terribly running the 2 8800s.

I've said it many times and it appears that I'll have to say it a couple of times more: the problem is Vista working with DX10-capable hardware.

my machine running 2 x 3500s flies within 3ds Max

QFX 3500 = 7800 GT = DX 9.0c hardware = no DX10 support = works fine in Vista.

Vista doesn't utilize DX10-capable hardware as it should right now. It uses about 2% of the speed that a DX10 card is capable of.

The workarounds are: use Vista with legacy, non-DX10 hardware, OR use the vastly superior OS, Win XP x64. Either works.

Oh yes, I forgot to mention that switching on SLI seems to make no difference to 3ds Max.

Every time I've said Vista is bad for Max, I've tossed in a "SLI doesn't work in Max or in 99% of 'professional' applications". The same goes for CrossFire and X2 cards. All additional GPUs are wasted, Quadro or otherwise.
 

silkey

Distinguished
Oct 6, 2008
3
0
18,510
Thanks for that. I'll tell my colleague he has to go back to XP64, or I'll give him one of my Quadros; either way, the SLI setup time (a lot of it, incidentally) and cost was a complete waste of money!
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
You're welcome. I discourage people from getting SLI/CrossFire/X2 for games as well. It's just not nearly as cost-effective as getting the best price/performance card once a year; that, and it's completely useless for most professional programs.
 