The i860 was frequently used as a geometry processor, due to its massive floating-point horsepower for the time. But this is really beside the point.

> In fact many of the early 3D accelerators are not popular at all, and were considered expensive (since some also included a RISC CPU on-board for rendering).
This whole line of argumentation seems to miss the point. Even if the PC gaming sector had no standard API (though the middleware packages I mentioned were a big step towards that), there were certainly standard practices in use at the time. Developers had embraced triangles, pixel fog, and even some Z-buffering - none of which the NV1 supported. And toolchains for game assets were triangle-oriented, not based on quadric patches. Maybe that's why only 1 of those 3 first-party ports shown in your YouTube video actually used them. The other two just used flat quadrilaterals, and I forgot to mention that the lack of fog made the draw-in on Panzer Dragoon look horrendous.

> Also there were no standards, again as mentioned, thus most games will use software based rendering.
The NV1 just ignored what everyone was doing, and went completely in its own direction. Not only was there going to be a learning curve, for anyone who adopted it, but it lacked ecosystem support.
Where? You contradicted yourself.
You should @GetSmart, yourself. You're trying to win an argument out of ignorance, and it's just not working for you.

> Early OpenGL was only available on high end workstations and very expensive proprietary OpenGL accelerators (from Sun, SGI, Apollo, Intergraph, etc). Earliest OpenGL implementations were mainly software based, and only much later hardware based.
OpenGL was developed by SGI, specifically to enable 3rd party developers to more easily leverage their hardware accelerators. It was based on their earlier IRIS GL library, which was already quite popular, before they opened it.
Yes, that's where 3D hardware started, because it was expensive. The reason I brought up OpenGL is that it was an open specification, predating nVidia's founding, that established an industry-standard API for hardware-accelerated 3D graphics.

> These were very expensive and targeted for professional use.
Again, I made this point to counter your suggestion that nVidia was operating in a complete vacuum and just got unlucky when MS decided to take Direct3D in a different direction. That suggestion shows extreme ignorance of the industry at the time. Your ego cannot handle this truth.
Your wall of details about the subsequent history of OpenGL on Windows is not relevant to this point, so I won't bother to help you correct them or fill some of the more glaring holes.
This is BS. There are lots of reasons why consumer cards had missing or incomplete OpenGL support, not least because they had a tendency to take various shortcuts that prevented full conformance with OpenGL. And in the one example you cited - a 3D Labs OpenGL chip that lacked D3D support - I'd suggest that's probably less a technical issue than a market-segmentation problem.

> Thus that is main problem with alternative 3D technologies, they are actually incompatible with each other.
But, you don't actually care what the real truth is - you're just hoping to bog me down in a wall of details to distract from the underlying point. I'm not about to fall for that.
Go ahead and waste more time, if you like. However, you'd do well to @GetSmart and heed the advice that "when you're in a hole, stop digging".