High-Resolution Die Imagery Of Nvidia GPUs


9Nails

Distinguished
Apr 13, 2001
Do you still have an 8800 GTX in use?

8800 GTX owner right here! Still in use and it's running every game that I throw at it. I have yet to find a reason to replace this wonderful chip.
 
Guest
Nvidia has had nothing to show for the last three quarters. Fermi is desperately late. On top of what will practically be a paper launch on March 26, it will take multiple months to reach any significant volume. Would you really call it a "hard launch" with under 10,000 units that will sell out in a matter of hours? I thought so; me neither.

So what do you do to fill the absolute marketing void between now and then? A little historical revisionism article under the cover of a cute family photo die shot, of course!

Let's consider those suspicious omissions:

-Nvidia's first and failed attempt: the NV1 STG-2000. (The ACTUAL first integrated 2D/3D solution, NOT the TNT2.)
-The stillborn NV2.
-The less-than-stellar NV3 Riva 128.
-The infamous NV30 GeForce FX "dust buster".

Honorable mentions:

-The hostile takeover of 3dfx, scavenging SLI from its still-warm corpse.
-The epic-fail "puppy" Fermi with wood screws, a hacksawed board, and glued connectors! A Tesla mock-up, but with REAL benchmark numbers backed up by... a video! BTW, the GF100 die shot is already available online.

Nvidia didn't provide a die shot for 30% of its own technology? How convenient! And it just so happens they are all FAILURES? LMAO! How about showing us the GPU pictures instead, and telling us you'll update with proper die shots when they show up?

Oh, I almost forgot: I still have a working TNT2, Kyro II, Rage 128, and 9700 Pro; I currently use a GeForce 6800 GS and am about to switch to an ATI 5770. As you can see, no fanboy here: best bang for the buck only. I don't give a damn about stupid branding. If Matrox or S3 miraculously came out on top with the best card, I would instantly dump both ATI and Nvidia.
 

nottheking

Distinguished
Jan 5, 2006
Cool slideshow, though I will say I'm a tad disappointed in the quality of the images nVidia has kept; it seems a lot aren't as high-contrast as most CPU shots (or some of ATI/AMD's GPU shots), and I have my doubts that the grid pattern on some of the earlier ones is actually part of the GPU. Still, it all looks nice, especially once we get to the G80/G92/GT200 dies.

And I also agree: a reference to the fabrication process would've been nice as well. For those who can't find them, I'm providing what specs I have on the GPUs that aren't listed in the article. I'm missing the die area for a number of the GPUs, which is unfortunate; that tidbit of data is often hard to find.

* NV4 - 8 million transistors, 350nm process.
* NV5 - 15 million transistors, 250nm process.
* NV10 - 22 million transistors, 220nm process.
* NV11 - 19 million transistors, 180nm process.
* NV15 - 25 million transistors, 180nm process.
* NV17 - 29 million transistors, 150nm process.
* NV20 - 57 million transistors, 150nm process.
* NV25 - 63 million transistors, 150nm process.
* NV40 - 222 million transistors, 130nm process.
* NV43 - 143 million transistors, 110nm process, 150 mm² die area. (mistakenly labeled in-article as NV46)
* G70 - 302 million transistors, 110nm process, 334 mm² die area.
* G80 - 681 million transistors, 90nm process, 484 mm² die area.
* G92 - 754 million transistors, 65nm process, 324 mm² die area.
* GT200 - 1,400 million transistors, 65nm process, 576 mm² die area.

Interesting to note is the upward slope in die area from G70 to G80 to GT200. Of course, this is a direct consequence of more than doubling the transistor count at each step while advancing no more than a full fabrication node.
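
If you want to sanity-check that, here's a quick back-of-the-envelope Python sketch using only the numbers listed above. The "ideal scaling" assumption - that area per transistor shrinks with the square of the node ratio - is just a rough rule of thumb I'm throwing in, not anything from nVidia or the article:

```python
# Transistor density and "ideal" die-area scaling, from the specs above.
chips = [
    # (name, transistors in millions, process node in nm, die area in mm^2)
    ("G70",   302, 110, 334),
    ("G80",   681,  90, 484),
    ("GT200", 1400, 65, 576),
]

prev = None
for name, mtrans, node, area in chips:
    print(f"{name}: {mtrans / area:.2f} million transistors per mm^2")
    if prev is not None:
        pname, pmtrans, pnode, parea = prev
        # Area predicted if density improved ideally with the node shrink:
        # scale the old die by the transistor ratio, then by (new/old)^2.
        ideal = parea * (mtrans / pmtrans) * (node / pnode) ** 2
        print(f"  ideal {pnode} nm -> {node} nm scaling predicts ~{ideal:.0f} mm^2 (actual: {area} mm^2)")
    prev = (name, mtrans, node, area)
```

Running that predicts roughly 504 mm^2 for G80 and roughly 519 mm^2 for GT200: in both steps the transistor ratio outruns what the shrink can ideally absorb, and GT200 lands even above the ideal estimate, which is exactly why the dies keep growing.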
 
My all-time favourite? Well, I must admit I kept my Riva TNT a looong time. That chip ROCKED! I must say, though, I was using a very special one: Asus's implementation, which added (through a VBIOS flash) AGP 2X, sideband addressing, and some further extras - such as 16 MB of 125 MHz SDRAM that was factory-clocked at 110 MHz but could be raised to its 'proper' frequency with a software patch. The card could also handle high AGP bus speeds (it worked at 83 MHz instead of 66).

Well, at first it didn't. To be fair, when I first got it I had it paired with a 3dfx Voodoo.

But then came the Detonator driver series (you know, the ones Nvidia used to sell GeForce2 'Ti' chips, identical to the non-'Ti' ones, at a premium...) - and suddenly the Riva TNT doubled in power! With the above patches, I went from an 11 fps average (stock, original driver) to 28 fps in Unreal (the story-mode FPS), patch level 26, on the opening benchmark (the fly-by with the castle).

I could even play Max Payne at 800x600 in medium-to-high detail levels. Of course, it lacked shaders (transform and lighting were early forms of shaders), but it worked.

I switched to a GeForce4 Ti 4200/8X when the Riva TNT couldn't handle GTA3 (well, it could, but only at 640x480 with the lowest settings available, and the fps was too low to make the game enjoyable), which I kept for quite some time - until I got a PCI Express mobo, actually, which I fitted with a passively cooled GeForce 6600 (not GT).

When the 6600 got too slow, I switched brands: the Radeon HD 4850 had come out, and Nvidia had nothing approaching that card at the time. (It's still a THG recommendation to this day - I like it when I buy a piece of hardware and it stays recommended for a couple of years.)
 

juliom

Distinguished
Jul 21, 2009
I started with:
- Rendition Verité 2200 8 MB (sucked, but could play Quake 3 on medium at 800x600!)
- Matrox G400 MAX DualHead 32 MB (loved it and its dual VGA outputs; as fast as a TNT2)
- ATI Radeon 9200 SE 128 MB (overclocked and still works perfectly)
- ATI 9600 XT 256 MB (sold it to a friend, still works)
- GeForce 7800 GTX 512 MB (awesome; higher GPU and memory clocks than the 256 MB version)
- ATI 2900 XT 512 MB GDDR3 (broken cooler; used the PCB to fix a friend's 2900 XT that had a good cooler but a broken PCB :p)
- ATI 4870 512 MB (current card, gonna use it until the ATI 6000 series comes out)

Fond memories...
 

Xenophage

Distinguished
Mar 6, 2009
I owned a card with the FIRST nVidia chipset, the NV1. The card was the Diamond Edge 3D, and it was around even before the 3dfx Voodoo. As I recall, it used a proprietary 3D API, and the card shipped with Virtua Fighter and some motorcycle racing game.

nVidia's NEXT chip was the Riva 128, which I also owned; it was the first nVidia card to support Direct3D and OpenGL.

Skipped the TNT, but I did have a TNT 2 Ultra. Loved that one. I took a brief hiatus from nVidia with the ATI Rage 128.

My next two nVidia cards were a GeForce 256 followed by a GeForce 3, both Asus if I recall correctly.

In my opinion the GeForce 1 and 3 were the most revolutionary designs by nVidia. I fondly remember the first time I loaded up Morrowind and Max Payne on the GeForce 3.

My next card was a Radeon 9700 Pro and I've been using ATI ever since.
 
Guest
Running two 8800 GTS 320 MB cards in SLI... very overclocked. Still runs new titles well.
 

rickzor

Distinguished
Feb 11, 2007
I still have a GeForce3 Ti 200 128 MB (overclocked) and a Ti 500 (voltmodded), both still kicking!

I'm impressed by how far these cards have gone - not a single artifact yet!
 

pcchip

Distinguished
Jan 22, 2003
My favorite was what was advertised as "The First GPU!"

I remember paying $300 for the GeForce 256, the original GeForce - and feeling like the coolest kid on the block because Team Fortress (the original) ran faster!
 
Guest
lol... I was actually thinking about the whole lineup of gfx cards I've had since 1999... it's quite a coincidence seeing this article on Tom's... sporting a Gigabyte GTX 285 atm... runs like a charm...
 

taintsauce

Distinguished
Apr 17, 2009
7950 GT for the win. The first and only top-end card I've purchased, and at the time it was well worth the cash.

The GeForce2 brings back some memories, though. Our second family computer rocked one of those. Quite the upgrade from the 4 MB ATI Rage in the previous unit - I almost shat myself when I discovered I could actually run then-modern games. Almost shat again when I went searching for memory upgrades a couple of years later, as the bloody thing used RAMBUS.
 

colonelc

Distinguished
Mar 3, 2010
The 3dfx Banshee. Yeah, I know this article is mainly about Nvidia, but I think this board was one of the first to really suggest where graphics cards and chips were headed. It was the first real "Wow!" since I got my first computer in 1986 (an Amiga 1000). What amazed me most about this particular card was how long I was able to enjoy using it until it finally couldn't do the job any longer. That led to my getting a GeForce 4, and then my current GeForce 8600 GT. I suppose soon I'll have to climb back to the top of the heap again.
 