I should have defined what I meant by "Image Quality" in this case: the ability to see certain PS3.0-exclusive effects.
As far as precision goes, I don't think it will be a real-world issue. I doubt that 32-bit precision will make much of a difference over 24-bit.
In comparative screenshots, the FX cards show banding artifacts because of their low 16-bit shader precision, while the Radeons display no banding at all.
24-bit should be quite sufficient, IMHO.
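To put rough numbers on that, here's a quick back-of-the-envelope Python sketch of my own (not shader code) that simulates the three formats by truncating float32 mantissas: FP16 keeps a 10-bit mantissa, ATI's FP24 a 16-bit one, and full FP32 a 23-bit one. It ignores exponent-range differences and rounds by truncation rather than how real hardware rounds, but it shows why a smooth gradient collapses into visible bands at FP16 long before FP24 becomes a bottleneck.

import numpy as np

def truncate_mantissa(x, keep_bits):
    # Zero out the low mantissa bits of float32 values (23-bit mantissa)
    # to roughly simulate a lower-precision float format.
    bits = x.view(np.uint32)
    mask = np.uint32((0xFFFFFFFF << (23 - keep_bits)) & 0xFFFFFFFF)
    return (bits & mask).view(np.float32)

# A smooth gradient across [0, 1), like a soft lighting falloff.
gradient = np.linspace(0.0, 1.0, 1_000_000, endpoint=False, dtype=np.float32)

# FP16 has a 10-bit mantissa, FP24 a 16-bit mantissa, FP32 a 23-bit mantissa.
for name, mantissa_bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    levels = np.unique(truncate_mantissa(gradient, mantissa_bits)).size
    print(f"{name}: {levels:,} distinct shades out of 1,000,000 samples")

FP16 ends up with only a few thousand distinct values across the gradient (hence the visible bands), while FP24 and FP32 both resolve far more steps than a display can show.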
I'm not so concerned with the corporate morality either, although I am an ATI fan because of the R3xx series' dedication to both image quality and shader power while Nvidia was resting on its laurels. I do think Nvidia's Futuremark cheating is just plain detestable, though.
But the 6800 shows me their laurel-resting is at an end. It'll be a tough call if the X800 is a bit faster in the raw speed department, though.
As always, the card I'll buy when the time comes will probably depend more on the financial aspects, i.e. "which card is the best in my price range?"
But even if the X800 is a tad faster, I'll probably consider the 6800 the ultimate card if its Shader Model 3.0 capability proves to be an advantage, since that feature isn't available on the X800 series.
For me, speed is important... but if one card is displaying 60 fps with better image quality/features, and the other is displaying 75 fps with fewer features, I'll go with the 60 fps card. That's my priority, and it's why I went with the R3xx GPUs in the last round.
________________
<b>Radeon <font color=red>9500 PRO</font></b> <i>(hardmodded 9500, o/c 340/310)</i>
<b>AthlonXP <font color=red>~2750+</font></b> <i>(2400+ @ 2.2GHz)</i>
<b>3dMark03: <font color=red>4,055</font></b>