The History Of Nvidia GPUs (Archive)

Status
Not open for further replies.

dstarr3

Distinguished
Man, I remember picking up a 6600 with my high school grad money. I loved that thing. An amazing upgrade from my Viper V770, lol. Doom 3, FEAR, Half-Life 2, that thing let me play some real good games.
And I ended up using that card all the way until just a few years ago when I replaced the whole computer with a new build equipped with a GTX770. And lemme tell you, that 6600 was a great card when it came out, but after seven or eight years, boy had that performance been surpassed a thousand times over. Super Meat Boy at 640x480, about 20fps. lol

Also, for the record, Viper V770, nVidia 6600, GTX770, my current 980 Ti, all great cards.
 
Looking at this made me feel like such an nVidia tool. Had a Riva TNT, a TNT 2, a GeForce 256, a GeForce 2, a GeForce 4200 Ti, a GeForce 6600GT, a GeForce 8800GTS and finally a GeForce GTX260 (that's EIGHT nVidia GPUs!). Switched to AMD for an HD7950 3GB (vs the GTX 770 2GB as nVidia's equivalent) and now have an R9 390 8GB (vs nVidia's GTX970 3.5GB), purely as value-based decisions - you just got more GPU for the same or less money with AMD. No regrets leaving Team Green (for now) to be honest, but jebus, reading this article it hit me how many nVidia cards I've owned over the years.
 
Literally, the only nVidia graphics card that I've used in a personal system was the GT 430 like five years ago because I was tight for money.

I've gone team red recently because of the performance/dollar in DX12 and Vulkan.

Not that there's anything wrong with nVidia cards. I'm not trying to start a fanboy argument here.
 

ledhead11

Reputable
I started out team "whatever they gave you", from Trash-80s to 8-bit Apple/Atari/Commodore. I then took chances on Voodoo/ATI until the early 2000s. I've been team green since the GTX 200s and, although the release schedules annoy me, the support usually doesn't. My latest 1080s are keeping me very happy! Love the article and look forward to enjoying the team red version (which I also wish the best for).
 

dstarr3

Distinguished


A polygon can be thought of as a simple 2D shape; assemble hundreds or thousands of these simple shapes and they form a detailed 3D object. The most efficient polygon to use would be a triangle, because a triangle is made up of three points, and there is no way to arrange three points in 3D space in such a way that they exist on more than one plane. Three discrete points can only exist on one plane, whereas four or more can exist in multiple planes. Which is to say, any polygon more complex than a triangle can be expressed more simply as two or more triangles. Therefore, triangles became the standard polygon in 3D modelling.

Here's a very detailed and very good explanation on the subject: https://www.youtube.com/watch?v=KdyvizaygyY
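
To make the "any polygon can be expressed as triangles" point concrete, here's a rough Python sketch (the fan_triangulate helper and the example quad are my own illustration, not from the article or the video):

# Fan triangulation: connect the first vertex to every pair of consecutive
# later vertices, turning an n-sided convex polygon into n - 2 triangles.
def fan_triangulate(vertices):
    """Return the triangles (as vertex triples) that cover a convex polygon.

    Assumes `vertices` lists the corners in order around the polygon.
    """
    return [(vertices[0], vertices[i], vertices[i + 1])
            for i in range(1, len(vertices) - 1)]

# A four-sided polygon (quad) becomes two triangles:
quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(fan_triangulate(quad))
# -> [((0, 0, 0), (1, 0, 0), (1, 1, 0)), ((0, 0, 0), (1, 1, 0), (0, 1, 0))]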
 

zMeul

Commendable
Using the term "GPU" to describe a video card is just wrong!!

GPU - a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.

http://www.nvidia.com/object/gpu.html
 

Karadjgne

Titan
Ambassador
I had a Voodoo3 3000 for years, loved that card, even if it was AGP. ATI X800, XFX 8800GT 512, Asus 660 Ti still running in the wife's PC, Asus Strix 970 in my PC. Definitely get my money's worth out of a GPU.
 
All that history, yet no mention of when nVidia bought out 3dfx and introduced the first SLI GPUs. I would think that would be a significant step for the company.

I expect an ATI/AMD article to come soon, and I hope they mention when they began CrossFire and how they came up with it. I'm hoping it wasn't just reverse engineering 3dfx's SLI.

My GPU experience has been somewhat limited, but here it is:
1MB Western Digital Paradise video card, Radeon 9250 128MB, Radeon X1650 Pro 512MB, Radeon 3850 512MB (my last AGP card), GeForce 8600M GT 256MB, Radeon 6870 1GB, CrossFire 6870s, Radeon R9 390 8GB.

Although I've had mostly ATI/AMD cards, I'm not loyal to either brand. With the early cards (the 3850 and earlier), I didn't really know much about what I was buying, but they were the right price and in stock when I went to get a card. After those, I learned and bought whatever was the best bang for my buck.
 

nutjob2

Reputable


The article is wrong. The NV1 made the boneheaded decision to use quadrilaterals, but all other cards (since) use the triangle as the graphical (shape) primitive because it is the simplest polygon. You can use whatever construct you wish (polygons, meshes, etc) but they all get boiled down to triangles for final processing and rendering.
 

nutjob2

Reputable


Please stop listening to Nvidia's corporate BS. There is no such "technical definition" of a GPU. Funny that they put it in quotes but don't include a reference. That's because they made it up themselves, of course.
 

Achoo22

Distinguished
The most efficient polygon to use would be a triangle, because a triangle is made up of three points, and there is no way to arrange three points in 3D space in such a way that they exist on more than one plane.
Since a triangle can never be concave, it is also dramatically simpler to test for intersection with a point or ray (an important function in graphics) than it would be for a general-purpose polygon.
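
For anyone curious, here's a rough Python sketch of the kind of test being described (the function names and sample points are mine): because a triangle is always convex, "inside" just means lying on the same side of all three edges.

def edge(a, b, p):
    """Signed area of the parallelogram spanned by (b - a) and (p - a)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def point_in_triangle(p, a, b, c):
    # Three sign checks; a general (possibly concave) polygon needs more work.
    d1, d2, d3 = edge(a, b, p), edge(b, c, p), edge(c, a, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all on one side, or on an edge

print(point_in_triangle((0.25, 0.25), (0, 0), (1, 0), (0, 1)))  # True
print(point_in_triangle((1.0, 1.0), (0, 0), (1, 0), (0, 1)))    # False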
 

Achoo22

Distinguished
You can use whatever construct you wish (polygons, meshes, etc) but they all get boiled down to triangles for final processing and rendering.
Having written rendering engines that use constructive solid geometry, splines, NURBS, etc., I believe your statement is nonsense. Even if you limit the scope of discussion to rasterizing systems, your statement remains false.
 

bit_user

Polypheme
Ambassador
I thought the NV1 was covered sufficiently well. It was a pretty bold move, on Nvidia's part, to go for quadratic surfaces. However, I'd guess they failed to predict the impact of Direct3D and its favoring of the lowest common denominator.

I also thought it was odd, because the graphics literature was full of bi-cubic patches and NURBS (non-uniform rational B-spline) surfaces. So their decision to go with quadratics seemed doubly odd. I assume it had to do with hardware implementation complexity and performance. Technically, I think it was a good choice. But you need the game developers to come along with you...
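
As a rough illustration of why quadratics look cheaper (my own Python sketch, not anything from the article): a quadratic patch needs only a 3x3 grid of control points and two short rounds of de Casteljau interpolation, versus a 4x4 grid for the bi-cubic patches the literature favoured.

def bezier_quadratic(p0, p1, p2, t):
    """De Casteljau evaluation of a quadratic Bezier curve at parameter t."""
    lerp = lambda a, b, s: tuple(ai + (bi - ai) * s for ai, bi in zip(a, b))
    return lerp(lerp(p0, p1, t), lerp(p1, p2, t), t)

def quadratic_patch(ctrl, u, v):
    """Evaluate a 3x3 control grid: three curves in u, then one curve in v."""
    rows = [bezier_quadratic(*row, u) for row in ctrl]
    return bezier_quadratic(*rows, v)

# A 3x3 grid of control points with a raised centre point:
ctrl = [[(x, y, 1.0 if (x, y) == (1, 1) else 0.0) for x in range(3)]
        for y in range(3)]
print(quadratic_patch(ctrl, 0.5, 0.5))  # centre of the patch bulges upward (z = 0.25)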

However, I couldn't let this pass:
Needing a separate 2D video card was common in the 1990s
Can anyone tell me another card, besides the first two generations of 3dfx, that was 3D-only? This was not "common".

Also, I wish for separate board shots and screen caps. Or at least split-screen or picture-in-picture. The cross fades make it rather hard to appreciate either.
 

bit_user

Polypheme
Ambassador
While that definition is rather arbitrary, I agree that the term GPU certainly wasn't popular in the '90s. You'd talk about "video chipsets", with some companies making chipsets (S3, Tseng, Cirrus Logic, etc.) for use in 3rd-party boards (Diamond, Paradise, etc.), while a few (old ATI, Matrox, Number Nine) used proprietary chips for their cards.

According to Wikipedia, Nvidia popularized the term in 1999.

I never bought Nvidia until the GTX 980 Ti. Starting with Maxwell, it seems Nvidia got a lot more efficient at extracting usable performance from their hardware. I recently compared specs vs. performance for some previous- and current-generation GPUs, and noticed an unmistakable trend of Nvidia getting better real-world performance than the specs would suggest.

If you only look at memory bandwidth and compute, you'd think RX 480 should compete with the GTX 1070. But, instead, it struggles against the GTX 1060. I wish I knew what was behind this. Certainly, Maxwell is a more efficient architecture than Kepler (which is the main reason for the performance difference), but why they hold this advantage over AMD is a mystery to me.
 

80-watt Hamster

Honorable


The article seemed to intentionally focus on the flagship chip of each generation. Understandable, as going into the sub-versions would produce an article that resembles a novel. What I found more interesting was that the GTX 750, as the first desktop Maxwell part, wasn't mentioned at all.
 

bit_user

Polypheme
Ambassador
Pixel Shader 2.0A featured a number of improvements ... instruction prediction hardware, and support for more advanced gradient effects.
Should be "instruction predication and gradient instructions".

Predication allows individual instructions to be made conditional. Gradient instructions simply improve the efficiency of algorithms involving the gradient operator (fairly common in graphics code).
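
A rough Python sketch of the predication idea (the pixel values and helper names are made up for illustration): rather than branching, the hardware evaluates both paths for every pixel and a per-pixel predicate decides which result is kept.

def bright(p): return p * 1.5
def dark(p):  return p * 0.5

def shade_with_branch(pixels):
    # What the source code says: a real branch per pixel.
    return [bright(p) if p > 0.5 else dark(p) for p in pixels]

def shade_predicated(pixels):
    # What predicated hardware effectively does: execute both instructions
    # for every pixel, then let the predicate select which result survives.
    out = []
    for p in pixels:
        keep_bright = p > 0.5              # predicate
        b, d = bright(p), dark(p)          # both always execute
        out.append(b if keep_bright else d)
    return out

print(shade_with_branch([0.25, 0.75]))   # [0.125, 1.125]
print(shade_predicated([0.25, 0.75]))    # same results, no divergent branch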


Also, I was kinda sad to see the years of each GPU's introduction get dropped (until near the end). I really liked that about the first 9 entries. I had to resort to opening this in a separate window:

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units

I also appreciated when the competing ATI/AMD cards were mentioned. It helps put these in context.
 

none12345

Distinguished
Trip down memory lane for me. TNT, TNT2, GeForce2, GeForce4, 6600GT....

The 6600GT was the end of the Nvidia road for me, though. The drivers were absolute crap. You could find a driver that worked fine with one game but wouldn't work for a different game. Then you could find a driver that worked with the second game but not the first. And then neither would work with another piece of software, then you'd find a third driver that worked with that but didn't work with either game, etc.

So when it came time to upgrade, I considered switching to ATI for the first time. Considering the home run they had with the 4800 series, and the driver issues, switching to a 4850 was a no-brainer.

That 4850 lasted way too long. It was playing every game I bought just fine through 2014 to the start of 2015, then I started hitting DX11-only games that it couldn't play at all. I had to replace it last year because it finally died. I was going to replace it anyway when Pascal/Polaris came out, but it died 7 months before that happened. So it got replaced with a 380.

Now that Pascal and Polaris are out, I'm not buying another card this soon. So I'm in wait mode for Vega/Volta to see if either is worthy of a ~$400 upgrade.

I'm open to Nvidia again, but they have to be the clear choice in performance/$ for me to switch back. If it's a tie within +/-10%, then AMD gets my business. I don't like some of Nvidia's business practices in the last few years, so Nvidia has to be a clear winner for me to give them my business again; say 25% more performance/$ and I'll switch back without hesitation. Performance/watt doesn't mean crap to me, though.
 

dstarr3

Distinguished


Yeah, I remember having driver problems on my 6600LE, too, although it wasn't that bad. Once the drivers started rolling out for the 9000 series, those drivers completely broke my computer, so I rolled back to the last working driver and was fine for the rest of that card's life. But I was stupid and never saved the driver installation file anywhere, so whenever I did a clean OS install, I had to spend an afternoon going through old driver repositories finding the right one again.
 
Heh - funny how the TNT entry seems to take a word or two from my comment back when the article was first published. Still, while not incorrect, the most common version of the TNT shipped with 8MB of SGRAM. While it had half the memory, the higher total throughput of SGRAM (it allowed simultaneous read and write) made it faster in several games. On the other hand, it was trivial to overclock the SDRAM from 110 to 125 MHz (as pretty much all models of the card came with 125 MHz-binned chips that were downclocked), which provided almost linear performance improvements.

And yes, the Detonator drivers did unleash a lot of performance from the card. With Detonator, sideband addressing enabled on the AGP bus, and the SDRAM overclocked, the average framerate went from 11.5 to 28 fps in the Unreal castle fly-by. However, the Detonator drivers were originally intended to let Nvidia add a 'Ti' to their GeForce2 line; it allowed them to sell the same old cards and claim 2x the performance when all you needed was a driver update.
 