http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia-vs-amd-image-quality.html
I just read that, and I know it's a bit biased being from Nvidia's blog, but the sources it references are unbiased. I'm wondering what you guys think of this.

Getting directly to the point, major German tech websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default “Quality” setting, in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. According to ComputerBase, AMD gains up to a 10% performance advantage by lowering its default texture filtering quality.
AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default anisotropic filtering (AF) quality of the HD 5800 series in the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get image quality closer to NVIDIA’s default driver settings.
Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to “High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).
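To put that 10% figure in perspective, here's a rough back-of-the-envelope sketch (hypothetical FPS numbers, not taken from any of the reviews) of how much a ~10% filtering shortcut could shift a head-to-head comparison when reviewers benchmark at default driver settings:

```python
# Rough illustration (hypothetical numbers): how a ~10% gain from reduced
# texture filtering quality can skew a default-settings benchmark.

def adjusted_fps(measured_fps: float, optimization_gain: float = 0.10) -> float:
    """Estimate the FPS a card would get at comparable image quality,
    assuming the reduced-quality default inflates performance by
    `optimization_gain` (ComputerBase cites up to ~10%)."""
    return measured_fps / (1.0 + optimization_gain)

# Hypothetical review numbers, not real benchmark data:
amd_default_fps = 62.0      # measured with Cat AI on default "Quality"
nvidia_default_fps = 58.0   # measured with NVIDIA's default filtering

amd_equal_iq_fps = adjusted_fps(amd_default_fps)

print(f"AMD as reported:      {amd_default_fps:.1f} FPS")
print(f"AMD at comparable IQ: {amd_equal_iq_fps:.1f} FPS")
print(f"NVIDIA as reported:   {nvidia_default_fps:.1f} FPS")
# 62.0 / 1.10 ≈ 56.4 FPS, so an apparent ~7% lead turns into a deficit.
```

That's obviously simplified (real results vary per game and the 10% is an upper bound), but it shows why the sites decided to change their test settings rather than compare defaults directly.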