> The add-on "Maxtreme 11" driver from Nvidia is also interesting. This plugin was developed specifically for 3D Studio Max and leads to a significant performance boost in this program. In contrast to the previous versions, Maxtreme 11 supports OpenGL as well as the DirectX API. The hardware shader operations of 3DSM especially benefit from it.

Maxtreme was once useful, some 8 or so years ago. Ever since DX entered DCC programs, OGL, Maxtreme, and the like have been dying. The last couple of iterations of Maxtreme were nothing more than DirectX with a fancy name to make Quadro buyers happy and make them feel special. It held no visual or speed advantage whatsoever. Then again, it introduced no ill effects either.
> But here we recognize that DirectX is slowly becoming acceptable in the workstation sector, which was previously reserved exclusively for OpenGL.

And when you say "slowly," you sound like it's 2001 all over again. Newsflash: it's 2008. DX has been a de facto standard for about 7 years now. True, not all DCC software reflects this yet — AutoCAD, for one, got its first DX support with the launch of Vista. But let's not kid ourselves here: such software is really only a crippled version of the actual DCC leaders like Max, Maya, XSI, Lightwave... And let's be serious for a moment: those programs work beautifully on 4-generations-old hardware because of their inherent purpose.
> There are 2 reasons for card instability: 1) inappropriate cooling, or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) run more stably.

This is not true. Improperly written applications, improper power (for whatever reason), and hardware incompatibility are just three more. I can probably think of others.
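The downclock-for-less-heat reasoning in the quote does follow the standard dynamic-power approximation P ≈ C·V²·f. A toy calculation, with made-up numbers chosen only to show the scaling (the capacitance, voltage, and clock values below are illustrative assumptions, not measured from any real card):

```python
# Toy illustration of the dynamic-power relation P = C * V^2 * f.
# All three inputs are hypothetical: they are NOT the specs of any
# actual GeForce or Quadro, just numbers that show the scaling.

def dynamic_power(c_farads: float, volts: float, hz: float) -> float:
    """Approximate dynamic switching power in watts."""
    return c_farads * volts ** 2 * hz

stock   = dynamic_power(1e-7, 1.2, 600e6)  # hypothetical "gaming" clocks
derated = dynamic_power(1e-7, 1.1, 500e6)  # hypothetical "pro" derating
print(f"{stock:.1f} W vs {derated:.1f} W")  # 86.4 W vs 60.5 W
```

Even a modest clock and voltage drop cuts the heat output noticeably, which is the whole basis of the "slower but more stable" argument — it just isn't the *only* source of instability.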
> I wish I would have saved my $3,500 in Quadro cards and purchased more 8800GTs, or even an ATI 4870X2.

I was trying to cut down on words and still get the main message across. Thus, I failed to mention that no "professional" application can use more than a single GPU — this means no CrossFire/SLI support and also no X2 card support. That doesn't mean you can't use these setups with them, just that they won't utilize the extra GPUs. So only games are going to be using all the extra power, and in truth, only games will need it.
> I would like to see some benchmarks with current GeForce 280/ATI 4870 cards vs these pro cards under Vista DirectX 10.

That would seem interesting, but as far as 3ds Max goes, D3D under Vista is horrible. My 8800GTX uses less than 5% of its strength in Vista. It's so much slower that it's actually slow(!). This has to do with the fact that either 1) Vista uses DX10 primarily and Max 2009 only just got full DX9.0c support/features, or 2) Autodesk just did a poor job with the Max/Vista implementation.
> This is not true. Improperly written applications, improper power (for whatever reason), and hardware compatibility are just three. I can probably think of more.

Yeah, but the thing is, all 3 of the factors you mentioned are going to affect gamer and pro cards equally. That just falls under "vis major" and there's nothing you can do about it. Debunking the belief that pro cards are inherently more stable was my only intent. Truth is, a squirrel can chew off your power line and crash your whole system, but it's not going to matter whether you have a Quadro or a GeForce inside.
> As for the rest of your comments . . . very informative, thanks a bunch.

Good to hear.
> BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because a BIOS flash alone doesn't make it a Quadro. The only way to get it to work as a Quadro is by using RivaTuner. And once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.

My whole point in the above was to show that you shouldn't want to turn your GeForce into a Quadro, even if you could. No reason to do so, really. Apples to apples, pro variants are actually slower than their gaming equivalents, because manufacturers slow them down on purpose — for "stability".
> A few years ago when a buddy of mine built his P4 system, we used a certain name-brand motherboard, coupled with an ATI 9600 Pro which was factory overclocked.

I find it very interesting that you chose to mention the ATI 9600 Pro. I just wrote about my old ATI card recently at the official Max forums, Area, here:
> Apples to apples, pro variants are actually slower than their gaming equivalents, because manufacturers slow them down on purpose, for "stability".

I find that out of my 6 computers running pro apps, the ones running Quadro cards crash more often than my GeForce 8800GT machine. A big one is transparency windows in apps like AutoCAD 2009, or flickering in the display when rotating 3D objects in AutoCAD/Max. I find the 8800GT ($200) with the extra memory can handle complex models with textures a lot better than my Quadro 1500 ($600). I would have to get a $1,500+ Quadro card to even try to compare it to the 8800GT. OpenGL is a thing of the past and will be completely obsolete in the next few years.
> OpenGL is a thing of the past and will be completely obsolete in the next few years.

That sounds really good. But Apple/OpenGL are not stupid. They've seen that they've been overrun, and they plan to release OpenGL 3.0 by the end of this year. As much as I don't like OGL now, competition is good for the end customer, and letting D3D rule unchallenged would be a bad thing in the end. Sloth is common when ruling unchallenged, so competition is good, and it looks like we're going to get some. How well it will perform and what additions it will bring, we will have to wait and see.
> ...and it was released August 11, 2008.

So I'm wrong and it's out. Now we only have to wait for OGL 3.0 compliant hardware. Or maybe not. The latest hardware is by ATI, with DX 10.1 and Shader Model 4.1 along with OpenGL 2.1. Will OpenGL 3.0 routines work on 2.1-compliant hardware? I don't know.
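Whatever the answer on 2.1 hardware turns out to be, an application can at least check the version its driver reports before using 3.0 features. A minimal sketch of that check — the version strings below are illustrative assumptions; a real program would obtain the string from the driver (e.g. via OpenGL's `glGetString(GL_VERSION)`):

```python
# Hedged sketch: compare a driver-reported OpenGL version string against
# a required minimum. Only the "major.minor" prefix is compared.

def gl_version_at_least(reported: str, required: str) -> bool:
    """True if the reported 'major.minor[.patch]' meets the requirement."""
    def parse(version: str) -> tuple:
        return tuple(int(part) for part in version.split(".")[:2])
    return parse(reported) >= parse(required)

# A card whose driver reports OpenGL 2.1 fails a 3.0 requirement:
print(gl_version_at_least("2.1", "3.0"))  # False
print(gl_version_at_least("3.0", "3.0"))  # True
```

The tuple comparison makes "10.1" correctly rank above "9.0" — a naive string comparison would get that backwards.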
> I think this is always going to be a debate between pro cards vs gaming cards until people take them into real-world performance, working on a project side by side, and really notice the difference.

There is no difference to be observed here, save for the drivers pushing the cards. And not the drivers in full — just the parts that cripple the gaming cards.
> We are switching all computers to Vista 64-bit, as XP 64-bit is junk and we require 8+ GB of RAM; that's why these benchmarks should also include a Vista 64-bit system, to see how the tests come out.

That depends on what software you are using. For me, Vista is unusable at the moment for anything but DX10 games, as it slows down my 8800GTX so much that it's actually slow in 3ds Max. Win XP x64, on the other hand, gives me full GPU speed along with a rock-stable OS that fully supports all of the 16 exabytes of RAM a 64-bit OS can address, and thus all of my 8 GB.
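The 16-exabyte figure above is simply the size of a 64-bit address space; quick arithmetic confirms it (note this is the theoretical addressing limit — any given Windows edition caps supported physical RAM far lower in practice):

```python
# A 64-bit address space spans 2**64 bytes; an exbibyte is 2**60 bytes.
address_space_bytes = 2 ** 64
exbibyte = 2 ** 60
print(address_space_bytes // exbibyte)  # 16
```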