I do believe that Nvidia's cards are usually better optimized for the OpenGL API than for DirectX, but I wasn't aware that the difference was THIS big.
they AREN'T. standard opengl runs just as badly on nvidia cards, because it exposes roughly the same features/hw as dx9.
BUT
opengl allows vendors to add "vendor-specific extensions", and through those they can expose extra features. once you use them you can gain new performance, or features, or whatever. in nvidia's case, that's how you get access to the nv30: use one of their extensions and you pretty much have to use all of them, since they are all intertwined. only then do you get good performance out of the nv30 for the first time. but this is stupid, because opengl and dx exist precisely to unify hw.
nvidia messes opengl up all the time. it was the same with every card from the gf1 onward: they had RC, TS, VAR, NVFP, NVVP (register combiners, texture shaders, vertex array range, NV_fragment_program, NV_vertex_program), and more, and they never once cared to expose their hw to opengl in a clean, well-designed way instead.
coding for nvidia is like reading through intel's processor specs: ugly, complicated, proprietary.
and with today's schedules in gamedev, nobody can afford an additional codepath programmed specifically for proprietary gf1+ hw, another for proprietary gf3+ hw, another for proprietary gf5+ hw. the exception is carmack, who takes the time to do exactly that. but he doesn't like it either; he has said it will be the last time he optimizes for specific gpus.
after doom3, nvidia WILL have to make hw that simply runs WELL in standard situations, or they will fall.
oh, and right now i see tons of people complaining how slow their 5200 is: from 50fps on my radeon9700pro down to 2-3fps on their 5200. ps2.0, and with it the main feature of dx9, is about UNUSABLE on those cards (you can see it in those benches too, where it simply got disabled).
nvidia messed up much worse than i thought.. i'm disappointed.
"take a look around" - limp bizkit
www.google.com