megamanx00
Splendid
The longer NVIDIA waits, the worse its position will be. Sure, I expect the GT300 cards to be faster in DX10 titles, but what about DX11? Right now it doesn't matter much, but developers are currently using ATI's DX11 cards since they're the only game in town. As developers optimize games to run faster on ATI's cards, it will put NVIDIA's offerings at a disadvantage.
If you recall the 2900 launch (I know, think back), it often sat between the 8800GTX and the Ultra in DX9 benchmarks, but was all but humiliated in DX10 benchmarks. Early reviews often sided with the 2900, but in newer DX10 games, which were often developed on NVIDIA hardware, the 2900XT did not, and does not, fare as well. This effect lasted through the 3800 series cards, and only with the 4800 series did ATI fix enough issues and gain enough ground to be truly competitive. Of course, the 4800 series cards are arguably less efficient than their NVIDIA counterparts simply because their arrangement of shaders requires more care from developers to be properly optimized.
So here is the problem for NVIDIA. The longer they delay, the more time developers have to optimize games for ATI's DX11 offerings. The more they optimize for ATI's groupings of 5 shaders, the more it erodes the advantage of NVIDIA's independent shader arrangement. If NVIDIA's more expensive (in terms of transistors, complexity, and of course $) GPUs fail to show a significant advantage in DX11 games, then their margins will be smaller not only for this generation but for the next, as it will probably have to be based on the G300. Admitting your competitor's approach was right and completely redesigning a chip is expensive, after all.