First off, I have to say that this has been a very good review. I naturally have several things I'd like to complain about. I mean, as readers, it's our God-given right to complain and never be satisfied. Right?
The good: you presented and explained each GPU very nicely, noted every card's gaming counterpart, included specifications for each card, and commented on them in detail.
So if everything is so well and good, why am I complaining?
Simple: the tests. You're using the SPECheatTest. It's well known that this test is optimized to show that even the crippled "workstation" cards outperform the "gaming" cards with far superior hardware. The fact is, at least 3 of the programs you tested here don't reflect the sheltered "bell jar" conditions the SPECheatTest poses. Well, actually it's 2. You haven't tested AutoCAD, and I can't really comment on the other applications as I am not familiar with them. The 2 applications I am familiar with, and fully competent to speak about, are 3ds Max and Maya.
What do these 2 have in common, other than being under the same roof now? DirectX support. Any application running on DX is not being crippled in the drivers. Here you will see ATI cards take huge leads over Nvidia. But I'm getting ahead of myself.
In all tests you failed to mention that OpenGL is horridly slow, even on these "professional" cards. This is, of course, in the case where you can choose between OGL and DX in the same application. Not only is it slow, it's visually incomplete, lacking the lighting and shadow display features that only DX 9.0c can provide. I'm willing to forgive you that last one, as maybe you only ran the tests and didn't check for visual quality differences. Not that the SPECheatTest you used can display these real-life conditions anyway.
So with these 2 things in mind, it's easy to see that only idiots or people unable to use DX would use OGL instead. To be fair, you did say that Nvidia's 8800 GTX... erm, I'm sorry, I meant Quadro FX 5600, is the best OGL card, and I agree. Nvidia uses archaic logic in its hardware and OGL fits that pattern perfectly, so it's no surprise that Nvidia should win. If you have to use OpenGL for any reason, Nvidia is your man... erm, company.
If, however, you're not stuck on the appalling Mac platform or with archaic software that doesn't support DX, it should be mentioned that ATI has a significant lead. Not surprising either, since it has 3x more shaders at the same price. Games can't use those well most of the time, but that is not the case with digital content creation (DCC) programs like 3ds Max, Maya or AutoCAD.
Speaking of games: you did mention that the 8800 GTX (= FX 5600) is 2 generations old. You failed to mention that the latest GTX 280 has 240 unified processors, and you failed to add it to the tests. Not that I think the SPECheatTest would show it being 2x faster, but the fact of the matter is: it is, in any of the 3 above-mentioned programs, and likely in all others that support DX (possibly OGL too, but I'm not sure how crippled that is in the drivers; more on this later).
Which brings me to the ultra-high end of the DCC world: ATI's latest 48x0 cards. These have 800 unified shaders. They are simply wiping the floor with all the cards mentioned in your article put together(!). All of them combined (if that were possible) don't have enough power to compete with even a single new card from ATI.
You conclude that ATI is the best deal at $1000, but you fail to notice and differentiate between outdated OGL programs and new D3D ones. So the ATI card is the absolute, undisputed winner for CURRENT DCC. The crown can, by no stretch of the imagination, go to Nvidia, unless you state that it is exclusively for outdated, OGL-only programs, in which case it does get it. Also, I'd again like to mention that the fact that you're using the SPECheatTest isn't helping you build your case either. And on top of all this, you also failed to mention that there is a 10x more powerful card for less than $300: the HD 4870. The difference is solely in the drivers, and not even the whole driver, just the OGL implementation.
Which finally brings me to drivers. All the professional cards are 99% the same as their gaming equivalents; they differ only in drivers. You said so, and I agree. What you failed to mention is that the professional cards are actually noticeably slower than their gaming equivalents. For stability, they say. I challenge anyone to prove that gaming cards are less stable.
There are 2 reasons for card instability: 1) inadequate cooling, or 2) poorly written drivers. Slowing the card down makes it produce less heat and thus (in theory) makes it more stable.
The problem with this nice theory is that NO DCC program available today can stress any of the cards mentioned here beyond their framebuffer capacity. Chances are your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can only utilize a single CPU core.
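Don't take my word for it, you can check this yourself. Here is a minimal sketch of my own (not from the review) that samples per-core CPU load while you tumble a heavy scene in the viewport; the psutil package and all the thresholds are just assumptions I picked for illustration. If one core sits near 100% while the rest idle, the viewport, not the GPU, is your bottleneck.

import psutil

def sample_core_load(seconds=10, interval=0.5):
    # Collect per-core utilization snapshots while the viewport is being redrawn.
    samples = []
    for _ in range(int(seconds / interval)):
        # percpu=True returns one utilization percentage per logical core
        samples.append(psutil.cpu_percent(interval=interval, percpu=True))
    return samples

def looks_single_core_bound(samples, hot=90.0, idle=30.0):
    # "Single-core bound" here means: in most snapshots exactly one core is
    # pegged while nearly all the others stay idle. Thresholds are arbitrary.
    hits = 0
    for cores in samples:
        pegged = sum(1 for c in cores if c >= hot)
        idling = sum(1 for c in cores if c <= idle)
        if pegged == 1 and idling >= len(cores) - 2:
            hits += 1
    return hits >= 0.7 * len(samples)

if __name__ == "__main__":
    print("Start tumbling a heavy scene in the viewport now...")
    data = sample_core_load()
    print("Looks single-core (CPU) bound:", looks_single_core_bound(data))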
Which again brings me back to drivers, as the only other cause of instability. Here is another interesting fact you might not have known: the people writing drivers for these 99% identical cards don't do it twice. The process may vary, but in a nutshell it goes like this: they write the driver for the gaming card. At that point it branches into two targets, games or DCC software. The DCC branch goes through further testing against the DCC applications, and in 99% of cases it's done there: it gets thoroughly tested to confirm there aren't any major bugs and is then shipped.
The gaming driver path does not end there. The driver programmer has one more duty: to cripple performance under whatever he deems to be "professional" software(!!!). So to reiterate: the driver programmer, instead of perfecting the drivers, actually sits down and starts writing code to CRIPPLE(!?!) the gaming line of cards. One would imagine he could employ his talents elsewhere.
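Just so we're clear on what I mean by "crippling", imagine something like the following. This is purely my own illustration of the alleged mechanism; the executable names, the profile names and the whole structure are made up for the example and not taken from any actual driver.

# Hypothetical application-detection logic, illustrating the claim above.
PROFESSIONAL_APPS = {"3dsmax.exe", "maya.exe", "acad.exe"}

def pick_render_path(exe_name, is_gaming_card):
    # On the claimed scheme, a gaming card drops to a slower path the moment
    # it recognizes "professional" software; a workstation card never does.
    if is_gaming_card and exe_name.lower() in PROFESSIONAL_APPS:
        return "reduced-performance path"
    return "full-performance path"

print(pick_render_path("3dsmax.exe", is_gaming_card=True))   # reduced-performance path
print(pick_render_path("3dsmax.exe", is_gaming_card=False))  # full-performance path
print(pick_render_path("crysis.exe", is_gaming_card=True))   # full-performance path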
So, crippled or not, the drivers are 99% the same. If instability afflicts one line of cards, the other isn't spared by its "superior" drivers. So, in reality, the workstation cards are no more stable than their gaming siblings, even if many would like you to believe otherwise.
In conclusion, I'd also like to nitpick the fact that you used very low resolutions for testing, capping at 1600x1200. As you might have guessed, anyone interested in working in 3D will start at that resolution as a minimum, not end at it. This is not a serious oversight, though, since you have been using the SPECheatTest for everything, so your results should be taken with a grain of salt anyway.
The add-on "Maxtreme 11" driver from Nvidia is also interesting. This plugin was developed specifically for 3D Studio Max, and leads to a significant performance boost in this program. In contrast to the previous versions, Maxtreme 11 supports OpenGL and also the DirectX API. The hardware shader operations of 3DSM especially benefit from it.
Maxtreme was once useful, some 8 or so years ago. Ever since DX entered DCC programs, OGL, Maxtreme and the likes of them have been dying. The last couple of Maxtreme iterations were nothing more than DirectX with a fancy name to make Quadro buyers happy and make them feel special. It held no visual or speed advantages whatsoever. Then again, it introduced no ill effects either.
Honestly, I haven't tried r11, but I seriously doubt it brings anything new (since that's not technically possible). And as for OGL support in it, that just goes to show how much they know about 3ds Max. With OGL you cannot enable viewport shadows or any of the advanced viewport lighting techniques possible only in D3D. So, as I said before, OGL is not only seriously slower (and I mean seriously), it's also lacking in the visual quality department. I've said it before and I'll say it again: only idiots or people unable to use DX will opt for OGL.
But here we recognize that DirectX is slowly becoming acceptable in the workstation sector, which was previously reserved exclusively for OpenGL.
And when you say "slowly", you sound like it's 2001 all over again. Newsflash: it's 2008. DX has been a de facto standard for about 7 years now. True, not all DCC software reflects this, like AutoCAD, which got its first DX support with the launch of Vista. But let's not kid ourselves here: such programs are really only crippled versions of the actual DCC leaders like Max, Maya, XSI, Lightwave... And let's be serious for a moment: those programs work beautifully on four-generations-old hardware because of their inherent purpose.
Thank you for reading,
That was short... maybe I should have published it