ATI R420 WON'T SUPPORT 3.0 SHADERS

Guest

Archived from groups: alt.comp.periphs.videocards.ati

http://www.theregister.com/2004/03/11/ati_drops_pixel_vertex_shader/
ATI 'drops pixel, vertex shader 3.0 support' from R420
By Tony Smith
Published Thursday 11th March 2004 09:59 GMT
ATI's upcoming R420 graphics chip will not support DirectX 9's version three
pixel and vertex shaders.

So claims German web site 3D Center, saying that the absence is "beyond
doubt".

The site argues that since the R420 is derived from the older, proven R300
architecture, it would never have been easy to 'bolt on' pixel and vertex
shader 3.0 support, so ATI instead decided to focus on improving shader 2.0
support. Indeed, it concludes that ATI even believes shader 3.0 support
isn't as important as some gamers and other graphics chip fans might think.
Shader 3.0 is "a beautiful, but rather useless check list feature", the site
says.

Essentially, shader 3.0 support won't be necessary until the next generation
of graphics chips arrives with the upcoming 'Longhorn' version of Windows in
mind. By then there should be much better shader 3.0 support in games, too -
there aren't any yet.

Much better, then, to focus on the technology that today's - and
tomorrow's - games do support, and make it work faster. That means shader
2.0.

The R420 will deliver that through its eight rendering pipelines containing
an unknown number of texture units and six vertex units. The 160 million
transistor chip will be fabbed at 130nm by TSMC. It will support DDR, GDDR 2
and GDDR 3 across a 256-bit interface. It is expected to be used in AGP 8x
boards.

Of course, Nvidia will tout shader 3.0 support when it launches the
long-awaited NV40 later this year. ®
 

Andrew


On Fri, 16 Apr 2004 19:58:01 GMT, "wired and confused"
<johnsongerry@comcast.net> wrote:

>Published Thursday 11th March 2004 09:59 GMT

Feel free to let us know other things that happened a month ago.
--
Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
 

minotaur


wired and confused wrote:
> http://www.theregister.com/2004/03/11/ati_drops_pixel_vertex_shader/
> ATI 'drops pixel, vertex shader 3.0 support' from R420
> [snip]

Not that there's a hope in hell it will run at more than 5 FPS on the
nVidia 6800U in an FPS game. Yes, it has PS 3.0 support, but it runs like
a dog, something they didn't tell you :)
 
Guest

On Sat, 17 Apr 2004 14:28:07 +1000, Minotaur <antnel@hotmail.com>
wrote:

>wired and confused wrote:
>> [snip]
>
>Not that there's a hope in hell it will run at more than 5 FPS on the
>nVidia 6800U in an FPS game. Yes, it has PS 3.0 support, but it runs like
>a dog, something they didn't tell you :)


Two words that describe your reaction to the 6800:-

Sour grapes !

Hope you didn't buy a 9800XT/256 meg. The extra 128 meg
is practically useless anyway with a card that slow - no way
it will run at resolutions requiring 256Meg without being a slide-
show.

John Lewis
 

Andrew


On Sun, 18 Apr 2004 05:59:37 GMT, john.dsl@verizon.net (John Lewis)
wrote:

>Hope you didn't buy a 9800XT/256 meg. The extra 128 meg
>is practically useless anyway with a card that slow - no way
>it will run at resolutions requiring 256Meg without being a slide-
>show.

Resolutions don't require 256MB, large textures do, and a 9800XT
handles them very well.
--
Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
 
Guest

> Resolutions don't require 256MB, large textures do, and a 9800XT
> handles them very well.

Actually, both do. The more RAM, the higher the resolution you can run with
AA or AF enabled.
A crude example: on a 128MB card, running a game with normal textures at
800x600 needs, say, 4MB for the framebuffer; add another 8-16MB for AA or AF.
That leaves roughly 100MB for textures. However, the higher the resolution,
the larger the framebuffer (bigger still if triple buffering is enabled),
leaving less memory for the textures.
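To make that trade-off concrete, here's a rough sketch of the arithmetic in
Python. The memory model is my own simplification, not something from the
poster or from ATI/nVidia documentation: 32-bit colour buffers, one 32-bit
depth/stencil buffer, and multisample AA assumed to store one extra colour
and depth sample per pixel per additional AA sample. Real drivers allocate
things differently, so treat the numbers as ballpark only.

# Back-of-the-envelope framebuffer maths for the example above.
# Assumptions (mine): 32-bit colour, one 32-bit depth/stencil buffer,
# multisample AA keeps an extra colour + depth sample per additional sample.

def framebuffer_mb(width, height, aa_samples=1, buffers=2, bytes_per_pixel=4):
    """Approximate framebuffer footprint in megabytes.

    buffers: 2 = double buffering, 3 = triple buffering.
    aa_samples: 1 = no AA, 4 = 4x multisample AA, and so on.
    """
    colour = width * height * bytes_per_pixel * buffers
    depth = width * height * bytes_per_pixel
    aa_extra = width * height * bytes_per_pixel * 2 * (aa_samples - 1)
    return (colour + depth + aa_extra) / (1024 * 1024)

card_mb = 128
for (w, h), aa, buf in [((800, 600), 1, 2), ((800, 600), 4, 2), ((1600, 1200), 4, 3)]:
    fb = framebuffer_mb(w, h, aa_samples=aa, buffers=buf)
    print(f"{w}x{h}, {aa}x AA, {buf} buffers: ~{fb:.0f} MB framebuffer, "
          f"~{card_mb - fb:.0f} MB left for textures on a {card_mb} MB card")

Even with this crude model, 1600x1200 with 4x AA and triple buffering eats
more than half of a 128MB card before a single texture is loaded, which is
exactly the squeeze described above.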
 
Guest

ATI's decision will indeed hurt some sales, depending on how smart the
customer is and how well nvidia publicises their card.
The fact is that the life span of any gaming card is around 18-24 months,
30 months at the most!

So consider the 8500, which came out about 4 years ago; it can still play
new games quite nicely. And even today MOST games don't use shaders.

DX9 games that use PS2.0 only came on the market this year, so for almost 3
years there was no real benefit to having DX9 support. Also note that the
Radeon 9700 came out 3 years ago, so it had to wait 2 years to get any real
DX9-enabled game.

Meanwhile, most people bought that card because it was faster/smoother in
DX8 and DX7 games.
 

Les


"Asestar" <a s e s t a r @ s t a r t . n o> wrote in message
news:Q4ugc.79925$BD3.9345563@juliett.dax.net...
> ATI's decision will indeed hurt some sales, depending on how smart the
> customer is and how well nvidia publicises their card.
> The fact is that the life span of any gaming card is around 18-24 months,
> 30 months at the most!
>
> So consider the 8500, which came out about 4 years ago; it can still play
> new games quite nicely. And even today MOST games don't use shaders.
>
>

Aren't there a good few new games requiring shaders?

> DX9 games that use PS2.0 only came on the market this year, so for almost 3
> years there was no real benefit to having DX9 support. Also note that the
> Radeon 9700 came out 3 years ago, so it had to wait 2 years to get any real
> DX9-enabled game.
>

July 2002 is not 3 years ago.

> Meanwhile, most people bought that card because it was faster/smoother in
> DX8 and DX7 games.
>
>
 

minotaur


John Lewis wrote:
> On Sat, 17 Apr 2004 14:28:07 +1000, Minotaur <antnel@hotmail.com>
> wrote:
>
>
>>wired and confused wrote:
>>> [snip]
>>
>>Not that there's a hope in hell it will run at more than 5 FPS on the
>>nVidia 6800U in an FPS game. Yes, it has PS 3.0 support, but it runs like
>>a dog, something they didn't tell you :)
>
>
>
> Two words that describe your reaction to the 6800:-
>
> Sour grapes !
>
> Hope you didn't buy a 9800XT/256 meg. The extra 128 meg
> is practically useless anyway with a card that slow - no way
> it will run at resolutions requiring 256Meg without being a slide-
> show.
>
> John Lewis
>


LOL, no, I still have my 9700 Pro *8) and I'm quite happy with 50-70+ FPS in
Battlefield Vietnam with everything on HIGH @ 1600x1200x32.

No sour grapes, because nothing uses PS 3.0 in a big way yet!
Far Cry has a patch for PS 3.0, but from reviews it does nothing for the
performance or quality of the game.

Unfortunately for you PS 3.0 fanboys, PS 3.0 won't be in wide use for a few
years yet! That's the reason I'm keeping this 9700 Pro for now. Why swap when
there's nothing new to take advantage of on a newer card besides extra raw
speed?


Minotaur *8)