Nvidia GeForce 256 celebrates its 25th birthday — company talks about a quarter century of GPU progression

My first graphics card (that wasn't an S3 in a pre-built PC) was the ATI All-in-Wonder 128 Pro, which competed with the GeForce 256. It was my brother's and my first PC build, using a 1.2GHz AMD Thunderbird CPU on an MSI mobo. We picked that card for the connectivity, having no idea about graphics speeds or anything. It had 32MB of VRAM, so how could it be any worse than the more expensive GeForce with the same memory, right? :)

Of course, now I know the GeForce model was quite a bit better at the time, and I've learned a bit more about GPUs since.
 
My first graphics card wasn't a GPU -- an ATI Radeon 7000 64MB SDR PCI. I finally got a "real GPU" years later, a Radeon 9600. I'm thinking a GeForce 6600 GT (actually twins for SLI) came after that, and on and on. Back then, upgrades were huge gen-on-gen and only took 12-18 months. Yes, top-of-the-line cards were much more affordable back then than they have been in recent years, but they have also grown in die size and shader count (using that generically), and therefore PCB size and cooler size (1-slot was the standard for some time, then 1.5-slot came, then 2-slot was wild, and so on). I'm sure AMD and nVidia have much larger development headcounts today.

Those were good days back when nVidia and ATI were trading blows.
 
That brings back memories. Instead of the huge press events Nvidia holds today, this was tucked into a little room at the Intel Developer Forum in Palm Springs, CA. There were perhaps three dozen people in the room. They served cake, and Jensen went into his presentation about why this wasn't just a new graphics card. It was quite engaging, and yet I had no idea it was an inflection point for the field. They gave us long-sleeve jerseys with a slogan about "today the world has changed..." Still have it stored away in a bin somewhere in the house.
 
First build was a 1.53GHz Athlon XP 1800+ with a GeForce3 Ti 200 in 2001. Upgraded to a GeForce4 Ti 4600 the following year.

Still have both the chameleon demo and the wolfman demo on my current build for nostalgia.
 
My first Nvidia GPU was a Creative Labs 256 Blaster, or something similar. Before that, I had cards from ATI, Matrox, S3 (a ViRGE), and 3dfx. I loved the competition back then, but even more, I enjoyed going to my local hardware store and checking out the graphics card boxes.
 
My first GeForce was a 6800, back in 2005. I was a university student at the time, and I bought it for the sole purpose of maxing out the graphics in Splinter Cell: Chaos Theory. Those were the days, man!
 
My first Nvidia card was the RIVA 128, replacing the ATI 3D RAGE with its buggy OpenGL driver. I had been planning to replace it with the Rendition Vérité V1000, but the actual cards were delayed by too many months. The RIVA 128 was pretty good and could actually run Softimage|3D.
 
Was that the Radeon VE, the budget version of the Radeon 7000 series released in 2001 and later renamed Radeon 7000 64-bit, etc.? The regular Radeon cards were already on their second-generation GPUs by the time they reached the 7000 series and did support hardware transform and lighting, but that budget version didn't. The VE only had 32MB of memory, not the 64MB the regular cards had, so yours can't be the non-T&L card. It also only had a 64-bit memory bus (maybe that's what you meant?).
 
They didn't support hardware transform and lighting, so they couldn't do all of the graphics computation on the graphics card, and therefore weren't the first GPUs.
That was a first for the GeForce 256: everything done on the graphics card, hence a GPU.
What is this logic? "All graphics computations"? Even today no GPU does all graphics computations. For example, RTX was the first to introduce real-time ray tracing; does that mean a GTX isn't a GPU?
 
I remember buying the Guillemot (later Hercules) GeForce 256 DDR back in the day. I wonder whatever happened to it? I'm guessing I probably sold it to fuel my next upgrade.
 
Definitely brings back memories. My first build was in summer 2000. I was still in high school but somewhat due for a computer upgrade. My dad was kind enough to agree that if I got a job, he would match whatever I made to boost my buying power.

I made ~$1300 CAD in about 6 weeks of pretty harsh labor, and my dad more than delivered on his word, as I got a $2600 budget without actually needing to spend my earnings. In retrospect, it was probably a lesson to get me ready to live off my own means.

I spent a lot of time before and after reading up on the best bang-for-the-buck build. The actual build featured a P3 550E OC'ed to 733MHz. It was also a period where RAM was crazy expensive: 128MB of SDR133 cost me ~$200 CAD. Within a year I was able to buy 256MB for less than $100. The highlights were definitely the Asus GeForce 256 DDR Deluxe with VIVO, paired with a gorgeous ViewSonic PF775.
 
No sound card?
 
Of course there was. A run-of-the-mill SB Live! Value PCI version for ~$60 CAD. Nothing particularly noteworthy.

Here's the full spec list, IIRC:
  • PIII 550E Coppermine, OC'ed to 733MHz
  • Asus i815 motherboard
  • Asus GeForce 256 DDR Deluxe
  • 128MB PC133 GeIL RAM, later upgraded to 256MB CAS2 Micron
  • ViewSonic PF775 <-- thing is a beauty, better than most 17" CRTs, including Sony ones
  • Creative SB Live! Value PCI sound card
  • AOpen 52x CD-ROM (*Edit* — the Creative 40-16-4 CD-RW was my 2003 build)
  • IBM Deskstar 75GXP, 45GB (all my friends' Deskstars became Deathstars; mine fortunately survived for 4 years)
  • 56k dial-up modem whose brand I forget
  • random assortment of case/PSU/kb+mouse
 
Hardware transform and lighting was the key part that had previously been done on the CPU. It's a central part of 3D rendering that all graphics cards performed after this, and it's the reason the GeForce 256 was termed a "GPU": to differentiate the cards that had T&L from those that didn't. That was everything a graphics card did in games at the time; ray tracing wasn't on the map for decades.
So the card you mentioned didn't have this and therefore wasn't a GPU, i.e., a card where everything involved in rendering a game's graphics at the time was done on the graphics card.
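For anyone curious what the T&L debate above is actually about: "transform and lighting" is just per-vertex math, a matrix multiply plus a dot product per light. A minimal Python sketch of the idea (purely illustrative; the GeForce 256 did this in fixed-function hardware, and the function and values here are made up for demonstration):

```python
import math

def transform_and_light(vertex, normal, mvp, light_dir):
    """Per-vertex transform & lighting: the work the GeForce 256
    moved off the CPU and into dedicated hardware."""
    # Transform: 4x4 model-view-projection matrix times the position.
    x, y, z = vertex
    v = [x, y, z, 1.0]
    clip = [sum(mvp[r][c] * v[c] for c in range(4)) for r in range(4)]
    # Lighting: simple diffuse term, N . L clamped at zero.
    n_len = math.sqrt(sum(n * n for n in normal))
    n = [c / n_len for c in normal]
    diffuse = max(0.0, sum(n[i] * light_dir[i] for i in range(3)))
    return clip, diffuse

# Identity matrix: the position passes through unchanged.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
clip, diffuse = transform_and_light((1.0, 2.0, 3.0),   # vertex position
                                    (0.0, 0.0, 1.0),   # surface normal
                                    identity,
                                    (0.0, 0.0, 1.0))   # light direction
# clip == [1.0, 2.0, 3.0, 1.0], diffuse == 1.0
```

Before the GeForce 256, the CPU ran a loop like this over every vertex every frame and sent only the finished screen-space triangles to the card; putting that loop into the graphics hardware is what the "GPU" label was marking.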