How is a DirectX version "x" capable card made?

sgtmattbaker

So Microsoft releases a new version of the DirectX API. If people want to use it, it is likely that they will need a video card that supports it. How is an API made first and then hardware made to do what it does? Is the hardware built to render images in a certain way, or is it more DirectX-specific? Also, if a card supports both DirectX 10 and OpenGL, does that mean it has "extra" stuff on it? What about one of those workstation Quadro FX cards? If you had an OpenGL game, would that be a good gaming card, or is it still designed for different applications?

Cards are limited to a certain DX level for two reasons:

The first is hardware; certain DX functions (like tessellation) have an additional hardware requirement. You can actually query which level a card supports; see the sketch after this list.
The second is processing power; newer DX releases tend to add new functionality that requires more processing power to perform.
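
Since Direct3D 11, the runtime makes this explicit: D3D11CreateDevice hands back the highest "feature level" the installed card and driver can handle. Here's a minimal sketch of such a check (a quick illustration, not production code):

```cpp
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // List the levels we'd like, best first; the runtime picks the highest
    // one the hardware can actually do (a DX10 card gets 10_x, not 11_0).
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,
    };
    ID3D11Device *device = nullptr;
    ID3D11DeviceContext *context = nullptr;
    D3D_FEATURE_LEVEL got;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]),
        D3D11_SDK_VERSION, &device, &got, &context);
    if (FAILED(hr)) {
        std::printf("No hardware D3D11 device available.\n");
        return 1;
    }
    std::printf("Highest supported feature level: 0x%04x\n", (unsigned)got);
    context->Release();
    device->Release();
    return 0;
}
```

A card that reports feature level 10_0 simply lacks the silicon for the 11_0-only calls (hardware tessellation, for example), so the runtime never exposes them to your application.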

The DX API basically has the functionality needed to render 3D images, which GPUs then take in and perform the relevant action for each function call. That being said, the implementation of the API can lead to subtle differences in rendering (shadows are a good example). So even though two different vendors may both support the same DX call, you may get a slightly different result based on implementation.
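
One way to see that vendor neutrality in code: the DXGI layer that sits underneath Direct3D enumerates whatever adapters are installed, and the application side is identical no matter whose GPU (and whose driver implementation) is doing the work. A rough sketch:

```cpp
#include <dxgi.h>
#include <cstdio>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory *factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void **>(&factory))))
        return 1;

    // Walk every adapter in the system; the same loop works regardless of
    // which vendor's driver is translating the API calls underneath.
    IDXGIAdapter *adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        std::wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```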

OpenGL is actually an older API than DirectX; the problem is it's a pain to work with. But because it's free to use (no royalties), many smaller studios stick with it to cut costs. It also has the benefit of working on non-Windows platforms (Linux/consoles) with almost no modification. Most cards support both the DX and OpenGL APIs.
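
To illustrate the "most cards support both" point: the same handful of OpenGL lines compiles unchanged on Windows and Linux (this sketch assumes freeglut is available) and prints what the driver exposes. On a gaming card, the renderer string names the same GPU that runs your DX titles:

```cpp
#include <GL/glut.h>
#include <cstdio>

int main(int argc, char **argv) {
    // glGetString needs a current GL context, so create a throwaway window.
    glutInit(&argc, argv);
    glutCreateWindow("glcaps");
    std::printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    std::printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    std::printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```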

Workstation cards will never be a good choice for gaming. Fact is, they are basically the same silicon as their gaming equivalents, but with much more stable drivers (certified for professional applications rather than tuned for games) and $500 or more added to the price.