urian

Distinguished
Jul 7, 2004
Is DX9.0c the same as DX9?
It says that the 6800 Ultra has DX9.0c support, while the X800 XT says it's got DX9 support.
 

Terracide

Distinguished
Sep 7, 2003
Correct me if I am mistaken, but I believe DX9.0c requires 32-bit floating-point precision/blending for full support, and ATi "only" has 24-bit, while nVidia has supported 32-bit from the FX series onward?
So the X800s are DX9.0b cards and the 6800s are DX9.0c cards.
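To put rough numbers on that gap: the rounding step (machine epsilon) of a float format is 2^-mantissa_bits. A quick C++ sketch, assuming FP16 has a 10-bit mantissa, ATi's FP24 a 16-bit mantissa (the s16e7 layout), and IEEE FP32 the usual 23 bits:

#include <cmath>
#include <cstdio>

int main() {
    // Relative rounding step of a float format is 2^-mantissa_bits.
    // Mantissa widths assumed: FP16 = 10, ATi FP24 = 16 (s16e7), FP32 = 23.
    const struct { const char* name; int mantissa_bits; } formats[] = {
        {"FP16", 10}, {"FP24", 16}, {"FP32", 23},
    };
    for (const auto& f : formats)
        std::printf("%s epsilon: %g\n", f.name, std::ldexp(1.0, -f.mantissa_bits));
    return 0;
}

FP24 rounds at about 1.5e-5 while FP32 rounds at about 1.2e-7, and the latter is the precision DX9.0c asks for.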

Terracide

Don't pretend - BE!
 

gobeavers

Distinguished
Jun 7, 2003
I think it also has to do with the new Pixel Shader 3.0.
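If you want to see what your own card reports, the check in Direct3D 9 is a caps query; a minimal C++ sketch (no error handling):

#include <d3d9.h>
#include <cstdio>

int main() {
    // Create the D3D9 object and ask the default adapter for its caps.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    // Shader Model 3.0 is the headline feature tied to DX9.0c.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        std::printf("Pixel Shader 3.0 supported\n");
    else
        std::printf("Pixel Shader %lu.%lu only\n",
                    (caps.PixelShaderVersion >> 8) & 0xFF,
                    caps.PixelShaderVersion & 0xFF);
    d3d->Release();
    return 0;
}

A 6800 reports 3.0 here; an X800 reports 2.0 even though it's sold as a "DX9" card.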

"Go forward until the last round is fired and the last drop of gas is expended...then go forward on foot!" -Patton
 

sweatlaserxp

Distinguished
Sep 7, 2003
The X800 series does not support DX9.0c; only the 6800 series does. The ATi cards actually run their pixel shaders at 24-bit floating-point precision (FP24), not 16-bit, and that falls short of the full 32-bit DX9.0c expects.
 
Guest

Guest
You're right about the 32-bit part, but I don't think the FXs support DX9.0c. Also, the FXs are really slow when running at 32-bit precision (look at Far Cry before patch 1.1); they usually have to fall back to lower precision, whereas the GF6 series runs everything at 32-bit without losing performance.
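To see why that fallback hurts, here's a toy C++ demo (my own illustration, not how the driver does it) that truncates a 32-bit float's mantissa to mimic narrower shader formats. It ignores exponent-range differences, but it shows the rounding error that turned into banding in Far Cry:

#include <cstdint>
#include <cstdio>
#include <cstring>

// Zero out the low mantissa bits of an FP32 value, leaving `bits` of the
// 23-bit mantissa, to roughly mimic a narrower floating-point format.
float truncate_mantissa(float v, int bits) {
    uint32_t u;
    std::memcpy(&u, &v, sizeof u);
    u &= ~((1u << (23 - bits)) - 1);
    std::memcpy(&v, &u, sizeof u);
    return v;
}

int main() {
    float coord = 1234.5678f;  // e.g. a large texture coordinate
    std::printf("FP32 (23-bit mantissa): %.4f\n", coord);
    std::printf("FP24 (16-bit mantissa): %.4f\n", truncate_mantissa(coord, 16));
    std::printf("FP16 (10-bit mantissa): %.4f\n", truncate_mantissa(coord, 10));
    return 0;
}

At this magnitude FP16 lands on 1234.0 while FP24 still resolves to about 1234.5625, which is why dropping from FP32 to FP16 can be visible on screen.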

Asus P4P800DX, P4C 2.6ghz@3.25ghz, 2X512 OCZ PC4000 3-4-4-8, Leadtek FX5900 w/ FX5950U bios@500/1000, 2X30gig Raid0
 

Spitfire_x86

Splendid
Jun 26, 2002
FXSUX

------------
My Website: http://geocities.com/spitfire_x86

My Rig: http://geocities.com/spitfire_x86/myrig.html & 3DMark score: http://geocities.com/spitfire_x86/benchmark.html