***<b>PIPELINES AND TEXTURES</b>***
If cards at that time never had more than one texture per pipeline, then the people describing them wouldn't simply have gotten confused about what they were saying. And if it were a mere misprint, then that misprint has consistently been made every other time I've seen the Geforce2's pipeline design mentioned.
Vimp, I've said over and over that the Geforce2 has four pipelines and can process two texels per pipeline.
A texel is a texture element... in fillrate terms, a texture applied to a pixel.
Four pipelines that can process two textures each. That is exactly what I have been saying all along.
What the Geforce2 does NOT have is <b>texture units</b>.
This is key:
<b>A TEXTURE UNIT is not the same as a TEXEL</b>
EACH pipeline of the Radeon 8500 has two TEXTURE UNITS, which are not texels. Think of a texture unit as hardware that can handle a number of textures, or texels. The Radeon 8500's texture units can handle three texels each.
So if you're describing them in the format "Pipelines x Textures", then a Geforce2 is a 4x2 architecture, and the Radeon 8500 is a 4x6 architecture.
But like I said, nobody uses this way of describing pixel pipeline power anymore. Now, the first number is used for pixel pipelines, and the second number is NOT texels, but TEXTURE UNITS.
If you describe them in the way hardware reviewers do today, you describe them as "Pipelines x Texture UNITS". Then the Radeon 8500 is a 4x2 architecture, and the Geforce2 is a 4x0 architecture. You dig?
That is why the Radeon 8500 is described as a 4x2 architecture.
And that's why the Radeon 9000 is described as a 4x1 architecture in all the reviews, despite the fact that it has 4 pipes and can process 6 texels per pipe.
Look at this chart of Tom's that shows the Radeon 8500 having 2 texture units per pipe that can process 3 texels (textures) each, and the 9000 having 1 texture unit per pipe that can process 6 texels (textures) each.
<A HREF="http://www6.tomshardware.com/graphic/200207181/index.html" target="_new">http://www6.tomshardware.com/graphic/200207181/index.html</A>
***<b>PIXEL SHADER EFFECTS</b>***
As far as pixel shader effects go, here is a screenshot of Morrowind on a DirectX 7 card like the Geforce2... no pixel shaders:
<A HREF="http://www.xgr.com/pic.php?img=/Images/Articles/2523/Morrowind23.jpg" target="_new">http://www.xgr.com/pic.php?img=/Images/Articles/2523/Morrowind23.jpg</A>
Now, here is a screenshot of Morrowind on a DirectX 8 card, like the Radeon 8500:
<A HREF="http://www.gamespy.com/asp/image.asp?/articles/march02/morrowind/morrowind1big.jpg" target="_new">http://www.gamespy.com/asp/image.asp?/articles/march02/morrowind/morrowind1big.jpg</A>
Now the water looks noticeably better in this screenshot, but in actual play it's MUCH better, because what the screenshot can't show is that the pixel-shaded water is MOVING like real water, REFLECTING beautifully while it does.
It can do this because its behavior is programmed by the developer who wrote the effect. Developers can do amazing things with programmable pixel shaders, but they can't do those same things with the simple shaders in DirectX 7 class cards, because those shaders are not programmable. They can only run the effects they were built for.
Here is a shot of LOMAC's water, which also requires a DirectX 8 card to view in the game:
<A HREF="http://www.lo-mac.com/ss/Mirage-Tomcat.jpg" target="_new">http://www.lo-mac.com/ss/Mirage-Tomcat.jpg</A>
Programmable pixel shaders allow for this, but the Geforce2's NSR (the "NVIDIA Shading Rasterizer", a primitive, non-programmable shader) is not capable of this sort of thing.
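Just to give a feel for what "programmable" means here, this is roughly what a DirectX 8 pixel shader looks like. It's a made-up minimal example in ps.1.1 assembly (the shader version the Geforce3 and Radeon 8500 generation introduced), stored as a C string the way you'd hand it to the D3DX assembler:
<pre>
/* Hypothetical two-texture effect: modulate a base texture with a  */
/* reflection map -- the basic idea behind shiny, reflective water. */
const char *tiny_shader =
    "ps.1.1        \n"   /* pixel shader version 1.1             */
    "tex t0        \n"   /* sample the base texture (stage 0)    */
    "tex t1        \n"   /* sample the reflection map (stage 1)  */
    "mul r0, t0, t1\n";  /* multiply them; r0 is the final color */
</pre>
The point is that the developer writes those instructions. A DirectX 7 card has no instruction stream to feed... it can only be switched between its built-in combine modes.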
As far as the motion blur in GTA3 vs. NFS: Underground goes, they are very different things, and if you saw both games you'd understand why.
The motion blur in GTA3 is approximated using DirectX 7 class effects... basically, re-rendering simple geometry multiple times for the effect (which is very processor intensive).
The motion blur in NFS: Underground is pixel-shader based. That means the processor does not have to calculate extra geometry to approximate the blur (the pixel shader does it all), and the blur also looks *much* more realistic.
If the developers only allow the blur to appear on DX9 cards, that was a design decision on their part... perhaps the added programmability of the DirectX 9 spec made it easier to implement than writing a backwards-compatible DX8 shader blur.
But regardless, a DirectX 7 card does not have the capability to process custom shader instructions at all.
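To make the difference concrete, here's a rough sketch of the two approaches in C. Every function in it is a hypothetical stand-in, not a real Direct3D call:
<pre>
/* All three helpers are hypothetical, stubbed out here just so the  */
/* sketch is self-contained.                                         */
void draw_scene(float offset)   { (void)offset;  /* render all geometry   */ }
void blend_frame(float opacity) { (void)opacity; /* blend pass into frame */ }
void run_blur_shader(void)      {                /* fullscreen PS pass    */ }

/* GTA3-style (DirectX 7): re-render the geometry several times at   */
/* small offsets and blend the passes. The geometry work multiplies, */
/* which is why it's so processor intensive.                         */
void blur_dx7(void)
{
    for (int pass = 0; pass < 4; pass++) {
        draw_scene(pass * 0.01f);  /* a full geometry pass, every time */
        blend_frame(0.25f);        /* accumulate it at 25% opacity     */
    }
}

/* NFS: Underground-style (shader-based): render the geometry once,  */
/* then let a pixel shader do the blending per pixel on the GPU.     */
void blur_shader(void)
{
    draw_scene(0.0f);       /* one geometry pass           */
    run_blur_shader();      /* the blur happens on the GPU */
}
</pre>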
Here, have a look at Tom's chart that shows how the Geforce2 Ti... the last and greatest version of the Geforce2... does not have what would today be considered a pixel shader:
<A HREF="http://www6.tomshardware.com/graphic/20011218/geforce-ti-01.html" target="_new">http://www6.tomshardware.com/graphic/20011218/geforce-ti-01.html</A>
The first cards to have what we consider "pixel shaders" didn't arrive until the Geforce3 and Radeon 8500.
One of the reasons the Geforce4 MX's got a bad rep is that they don't even have pixel shaders...
the Geforce4 MX's (the 420, 440, and 460) are only DirectX 7 hardware, and can't show pixel-shaded effects.
Of course, the Geforce4 Ti series (the 4200, 4400, 4600, and 4800) DOES have pixel shaders, and is true DirectX 8 class hardware.
________________
<b>Radeon <font color=red>9500 PRO</font></b> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</font></b> <i>(o/c 2400+ w/143MHz FSB)</i>
<b>3dMark03: <font color=red>4,055</font></b>