NV40 to have 222 million transistors

Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

"NVIDIA NV40 to have 222 million transistors"
Posted on Saturday, April 10 2004 @ 18:33:52 CEST by LSDsmurf

Next week NVIDIA is going to launch its NV40. According to the Inq,
this chip has 222 million transistors. The NV40 is produced by IBM on
a 130nm process. It will use the 60-series drivers, currently at
version 60.70.
As we said before, NVIDIA will push hard on the fact that it is the
only firm with a Shader Model 3.0 part on the market, and it will
focus its marketing efforts there. This means full support for Shader
Model 3.0: Vertex Texture Fetch, long programs, Pixel Shader flow
control, and full-speed FP32 shading. Let's hope everyone understands
what this means.
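
For those wondering what "Pixel Shader flow control" actually buys you:
earlier hardware essentially had to evaluate both sides of a branch for
every pixel and then select one result, while SM 3.0 shaders can genuinely
skip work per pixel. Here's a toy Python sketch of the difference (a
made-up cost model, not real shader code; real hardware also branches in
groups of pixels, so divergent branches still cost something):

# Toy model: "cheap" and "expensive" stand in for two shading paths.

def shade_predicated(pixels):
    """Pre-SM 3.0 style: compute BOTH paths for every pixel, then select."""
    cost = 0
    for p in pixels:
        cheap = p * 0.5        # always computed
        expensive = p ** 0.5   # always computed, even when discarded
        cost += 1 + 4          # pay for both paths
        result = expensive if p > 0.5 else cheap
    return cost

def shade_branching(pixels):
    """SM 3.0 style: dynamic flow control skips the path not taken."""
    cost = 0
    for p in pixels:
        if p > 0.5:
            result = p ** 0.5
            cost += 4          # only the expensive path
        else:
            result = p * 0.5
            cost += 1          # only the cheap path
    return cost

pixels = [i / 100 for i in range(100)]
print(shade_predicated(pixels))  # 500: both paths paid everywhere
print(shade_branching(pixels))   # 247: work actually skipped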

The chip has plenty of other features: it can do 16 pixels per clock
(color & Z) or 32 pixels per clock (Z-only), and offers 64-bit FP
frame buffer blending and display, lossless color and Z compression,
a new antialiasing approach called High Quality AA with a rotated
grid, full MTR (multi-target rendering, I guess), and accelerated
shadow rendering.
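
If "rotated grid" doesn't mean much to you: with an ordered grid the
sub-pixel samples line up in rows and columns, so a near-vertical edge
only ever crosses two distinct x positions and you get very few coverage
levels. Rotating the pattern gives every sample its own x and y. A quick
Python sketch (the offsets are illustrative, not NVIDIA's actual sample
positions):

import math

# 4x ordered grid: samples at the centers of a 2x2 sub-pixel grid.
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def rotate(pattern, radians):
    """Rotate sample offsets about the pixel center (0.5, 0.5)."""
    c, s = math.cos(radians), math.sin(radians)
    return [(0.5 + (x - 0.5) * c - (y - 0.5) * s,
             0.5 + (x - 0.5) * s + (y - 0.5) * c) for x, y in pattern]

# Rotating by atan(1/2) (~26.6 degrees) gives each sample a distinct
# x and y, so a vertical edge sweeping across the pixel crosses the
# samples one at a time: 5 coverage levels instead of 3.
rotated = rotate(ordered, math.atan2(1, 2))

print(sorted(round(x, 2) for x, y in ordered))  # [0.25, 0.25, 0.75, 0.75]
print(sorted(round(x, 2) for x, y in rotated))  # [0.16, 0.39, 0.61, 0.84]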

The GeForce 6800 Ultra has two power connectors and, surprisingly,
the reference card is a single-slot design. It uses GDDR3 memory
clocked at 550MHz, but some partners might go even higher.
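
For some back-of-the-envelope context on that memory clock (assuming the
rumoured 256-bit bus, which the article itself doesn't state):

# Rough bandwidth estimate. The 256-bit bus width is an assumption;
# the article only gives the 550MHz GDDR3 clock. GDDR3 is double data
# rate, so it transfers twice per clock.
mem_clock_hz = 550e6
effective_hz = mem_clock_hz * 2          # DDR: two transfers per clock
bus_bytes = 256 / 8                      # assumed 256-bit bus
print(effective_hz * bus_bytes / 1e9)    # 35.2 GB/s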
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

And your point is?
"NV55" <nvidianv55@mail.com> wrote in message
news:1c4cde47.0404101751.2ddd8090@posting.google.com...
> "NVIDIA NV40 to have 222 million transistors"
> Posted on Saturday, April 10 2004 @ 18:33:52 CEST by LSDsmurf
>
> Next week NVIDIA is going to launch its NV40. According to the Inq,
> this chip has 222 million transistors. The NV40 is produced by IBM on
> a 130nm process. It will use the 60-series drivers, currently at
> version 60.70.
> As we said before, NVIDIA will push hard on the fact that it is the
> only firm with a Shader Model 3.0 part on the market, and it will
> focus its marketing efforts there. This means full support for Shader
> Model 3.0: Vertex Texture Fetch, long programs, Pixel Shader flow
> control, and full-speed FP32 shading. Let's hope everyone understands
> what this means.
>
> The chip has plenty of other features: it can do 16 pixels per clock
> (color & Z) or 32 pixels per clock (Z-only), and offers 64-bit FP
> frame buffer blending and display, lossless color and Z compression,
> a new antialiasing approach called High Quality AA with a rotated
> grid, full MTR (multi-target rendering, I guess), and accelerated
> shadow rendering.
>
> The GeForce 6800 Ultra has two power connectors and, surprisingly,
> the reference card is a single-slot design. It uses GDDR3 memory
> clocked at 550MHz, but some partners might go even higher.
 

Andrew
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

On 10 Apr 2004 18:51:16 -0700, nvidianv55@mail.com (NV55) wrote:

>"NVIDIA NV40 to have 222 million transistors"

Ooh, transistor envy.
--
Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

> a new antialiasing approach called High Quality AA with a rotated grid,
> full MTR (multi-target rendering, I guess), and accelerated shadow rendering.

Wow, only took them 4 years to catch up to the Voodoo5 in terms of AA
quality.

Of course, the Voodoo5 wasn't exactly playable in most games using 4xFSAA -
but it sure did look nice.
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

"Darkfalz" <darkfalz@microsoft.com> wrote in
news:c5c21o$1skf$1@ID-108208.news.uni-berlin.de:

>> a new antialiasing approach called High Quality AA with a rotated grid,
>> full MTR (multi-target rendering, I guess), and accelerated shadow rendering.
>
> Wow, only took them 4 years to catch up to the Voodoo5 in terms of AA
> quality.
>
> Of course, the Voodoo5 wasn't exactly playable in most games using
> 4xFSAA - but it sure did look nice.

I agree. I was also surprised that it's taken them this long to reach the
same level of FSAA that the V5 had way back when. Well, at least they've
finally done it, sort of. We still have to wait for the actual cards to
show up on shelves.
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

"Kentucky77" <We Are Borg> wrote:

>(snip)

*plonk*
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

Oh, by the way, depending on the definition of "best", you might look into
ATI or nVidia workstation-class cards, like the Quadro line from nVidia. The
prices are modest: a few thousand bucks for a 3D card never ruined
anyone's balance. Then you can take your 3D card to bed and make sweet
love to it (your bed is by the fireplace, right?).
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

"Darkfalz" <darkfalz@microsoft.com> wrote in message
news:c5c21o$1skf$1@ID-108208.news.uni-berlin.de...
> > a new antialiasing approach called High Quality AA with a rotated grid,
> > full MTR (multi-target rendering, I guess), and accelerated shadow rendering.
>
> Wow, only took them 4 years to catch up to the Voodoo5 in terms of AA
> quality.
>
> Of course, the Voodoo5 wasn't exactly playable in most games using
> 4xFSAA - but it sure did look nice.

You didn't ever have a V5, did you? Games were perfectly playable - and
beautiful - at 4xFSAA.

Of course, you could get away with playing them at 640x480 (made it look
like 1024x768 I kid you not!)

Those were the days...

Neil
(Hated Nvidia for what they did to 3dfx, then "upgraded" to a GF2, then
finally was able to jump from evil Nvidia when Radeons became a
recognised force!)
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

"Tony DiMarzio" <djtone81@hotmail.com> wrote in message
news:JK2dnV8itt44hufdRVn-sw@comcast.com...
> Thank you for that. Was unaware of R3xx's AA implementation type. However,
> I'd argue that RGSS, while much more demanding of the GPU, still produces
> better edge and texture quality than RGMS. Then again you can always
> combine AF with AA on current hardware to achieve similar effects.

I agree, but even 3dfx was moving away from super-sampling; Rampage was
going to support only multi-sampling for its AA modes.

If you read nVidia's developer PDF on AA, super-sampling is the only way to
consistently AA DX9 games (depending on how the scene is being rendered).
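
For anyone following along at home, here's a much-simplified Python sketch
of where the shading cost goes in each scheme, and why only super-sampling
also cleans up aliasing produced by the shader itself (alpha tests, shiny
highlights, etc.). It's a toy model, not any particular GPU's
implementation:

def shade(x, y):
    """Stand-in for an arbitrary pixel shader; its output can alias."""
    return 1.0 if (x * 13.37 + y * 7.77) % 1.0 > 0.5 else 0.0

def supersample(x, y, offsets):
    """SSAA: run the shader at every sub-sample, then average.
    Smooths geometry edges AND shader/texture aliasing."""
    return sum(shade(x + dx, y + dy) for dx, dy in offsets) / len(offsets)

def multisample(x, y, offsets, coverage):
    """MSAA: shade ONCE per pixel, replicate to covered samples.
    Only the coverage mask (geometry edges) gets antialiased;
    whatever the shader does internally still aliases."""
    color = shade(x + 0.5, y + 0.5)      # single shader invocation
    covered = sum(1 for hit in coverage if hit)
    return color * covered / len(coverage)

offsets = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]
print(supersample(10, 20, offsets))              # averaged: in between
print(multisample(10, 20, offsets, [True] * 4))  # 0.0 or 1.0: still hard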

John
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

"[neil]" <neilw@legend.co.uk> wrote in message
news:c5gtjh$1j6rn$1@ID-225607.news.uni-berlin.de...
> "Darkfalz" <darkfalz@microsoft.com> wrote in message
> news:c5c21o$1skf$1@ID-108208.news.uni-berlin.de...
> > > a new antialiasing approach called High Quality AA with a rotated grid,
> > > full MTR (multi-target rendering, I guess), and accelerated shadow rendering.
> >
> > Wow, only took them 4 years to catch up to the Voodoo5 in terms of AA
> > quality.
> >
> > Of course, the Voodoo5 wasn't exactly playable in most games using
> > 4xFSAA - but it sure did look nice.
>
> You didn't ever have a V5 did you? Games were perfectly playable - and
> beautiful - at 4xFSAA.

I have one now... and still use that system on a daily basis.

I don't consider 20-30 fps in an FPS "playable", and it also made the
controls rather laggy. 2xFSAA was rather playable in Quake 3 and UT, but I
still prefer framerate in an FPS.

That being said, 4xFSAA was great for slower-moving games like Ultima IX
and is still superb for PSX and N64 emulators.

> Of course, you could get away with playing them at 640x480 (made it look
> like 1024x768 I kid you not!)

Nah, it didn't, mostly because the mip mapping is so much blurrier at
lower resolutions. But it did look good.
I will take 1024x768 with no AA over 640x480 4xFSAA any day.

> Those were the days...
>
> Neil
> (Hated Nvidia for what they did to 3dfx, then "upgraded" to a GF2, then
> finally was able to jump from evil Nvidia when Radeons became a
> recognised force!)

I got a 5200 and quickly swapped it for a 5700. Quite happy to be on the
NVIDIA bandwagon now. I consider them the "standard", much like Intel.
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati

Nvidia killed off 3dFX? How? 3dFX used to be the leader in the field. I had
thought 3dFX killed itself when it decided no one needed 32-bit color (3dFX
cards could only do 16-bit color). I remember my 3dFX Voodoo 3/2000 PCI. Quake
2 and Half Life came alive w/that card (I'd been playing in software mode up
'til then).

>Neil
>(Hated Nvidia for what they did to 3dfx, then "upgraded" to a GF2, then
>finally was able to jump from evil Nvidia when Radeons became a recognised
>force!)


-Bill (remove "botizer" to reply via email)
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

Darkfalz wrote:
> Typical AMD fanboy "ashamed of my inferior CPU" denial.
>
> Look, AMD is fine if it's all you can afford. But people who realise
> that you get what you pay for and want quality choose Intel.

It's ignorance like this about technology that put Microsoft in the position
it's in. While I could post links to shootouts between Intel's latest and
AMD's latest, I won't even bother, since you can't be bothered to be
educated about this subject before you speak about it.
 

Tip
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

> Typical AMD fanboy "ashamed of my inferior CPU" denial.
>
> Look, AMD is fine if it's all you can afford. But people who realise that
> you get what you pay for and want quality choose Intel.
>

Hmm, shame that Intel has to copy AMD just to catch up. LOL

http://www.theinquirer.net/?article=15029

--

Tip

http://gotips.net/
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.hardware.video,alt.comp.periphs.videocards.ati

I have both Intel and AMD, and I can say I have been greatly impressed by
AMD. I used to be Intel all the way, but you definitely get more for your
dollar with AMD than with Intel. No problems of any kind, and I have had 4
generations of AMDs by now. This person simply can't accept that he may be
wrong by being a slave to Intel. Not that Intel processors are bad; all AMD
is doing is offering greater value to get you to switch. If they ever become
dominant, the same could happen to them as is happening to Intel.

You get what you paid for - screwed~

Only fools believe you ALWAYS get what you pay for.

On Thu, 15 Apr 2004 12:08:17 -0500, "The Mighty MF" <sether01@hotmail.com> wrote:

>Darkfalz wrote:
>> Typical AMD fanboy "ashamed of my inferior CPU" denial.
>>
>> Look, AMD is fine if it's all you can afford. But people who realise
>> that you get what you pay for and want quality choose Intel.
>
>It's ignorance like this about technology that put Microsoft in the position
>it's in. While I could post links to shootouts between Intel's latest and
>AMD's latest, I won't even bother, since you can't be bothered to be
>educated about this subject before you speak about it.