New Voodoo3 3000 driver!

Good riddance to 3dfx. I bought a card a week before they went under. They were good for a brief time in the 20th century LOL

Hey man, without 3DFX we may not have the level of advanced graphics that we do today.

Still got a Voodoo 3 2000 that's alive and kickin' arse. Played Half-Life and Quake 3 like a gem.
 
Yeah, my 3500 rocked it up in Unreal Tournament like nobody's business man! 😀
 
I remember when a friend of mine bought a Voodoo 3... I was jealous. My first GPU was a Riva TNT2 Ultra. Bad a$$ back in the day.

Technically, GPUs didn't exist until the GeForce 256.
Don't you mean the TERM GPU didn't exist? The TNT2 was doing what a GPU does; I think the term just hadn't been coined yet.

It's like how Texas Instruments' processors in calculators COULD be considered CPUs because they are the Central Processing Unit of the Calculator, but of course when we use the term CPU we are usually referring to CPUs in Personal Computers.

Correct me if I'm wrong.
 
Damn, I was like 10 years old back then, but I seem to recall that the fact that the GeForce 256 had hardware T&L is what made it a GPU...
 

Actually, the introduction of the GeForce series had something to do with a programmable graphics chip, something that hadn't been around before, and thus the term GPU was coined.
 
*sigh*

http://en.wikipedia.org/wiki/Graphics_processing_unit

Please read that.

A Graphics Processing Unit or GPU (also occasionally called Visual Processing Unit or VPU) is a dedicated graphics rendering device for a personal computer, workstation, or game console. Modern GPUs are very efficient at manipulating and displaying computer graphics, and their highly parallel structure makes them more effective than typical CPUs for a range of complex algorithms.

A GPU implements a number of graphics primitive operations in a way that makes running them much faster than drawing directly to the screen with the host CPU. The most common operations for early 2D computer graphics include the BitBLT operation (combines several bitmap patterns using a RasterOp), usually in special hardware called a "blitter", and operations for drawing rectangles, triangles, circles, and arcs. Modern GPUs also have support for 3D computer graphics, and typically include digital video-related functions as well.
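Just to make the BitBLT / RasterOp part concrete, here's a rough software sketch in C of what a blitter does in dedicated hardware (the framebuffer layout and the XOR raster op are made up for illustration, not any real card's interface):

#include <stdint.h>

#define FB_WIDTH  320   /* hypothetical framebuffer size, just for the example */
#define FB_HEIGHT 200

/* Software version of a BitBLT with a simple raster op (XOR here):
 * copy a w-by-h source bitmap into the framebuffer at (x, y),
 * combining each byte with what is already there. A blitter does
 * exactly this copy-and-combine loop in hardware instead of tying
 * up the host CPU. */
void bitblt_xor(uint8_t *fb, const uint8_t *src, int x, int y, int w, int h)
{
    for (int row = 0; row < h; row++)
        for (int col = 0; col < w; col++)
            fb[(y + row) * FB_WIDTH + (x + col)] ^= src[row * w + col];
}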

Sealboy might have something with the Hardware T&L being a requirement for the GPU though...

The first company to develop the GPU is NVIDIA Inc. Its GeForce 256 GPU is capable of billions of calculations per second, can process a minimum of 10 million polygons per second, and has over 22 million transistors, compared to the 9 million found on the Pentium III. Its workstation version called the Quadro, designed for CAD applications, can process over 200 billion operations a second and deliver up to 17 million triangles per second.

But that doesn't seem entirely accurate to me, because the Amiga had a dedicated chip for graphics processing, so why doesn't that qualify?

Anyway, bah. Whatever. It's a stupid terminology argument anyway so meh. 😀
 
Actually, I think the GeForce 3 was the first programmable GPU.

The term GPU was more of a marketing gimmick back then. I remember a lot of people refused to use it. But the GeForce was one of the first cards to integrate a lotta crap into one chip... including a memory controller.

Regardless, the kick ass performance didn't really come until the GeForce 2, which was released a few short months later.

And of course, the GeForce 3 was the ultimate card. That's when graphics really took a leap forward.
 
How would that fit in any computer case :S And what is that waste of board at the end that looks like it has no PCB on it :S Weird design.

Back when I was in high school, I used to 'acquire' old computers from people or businesses - take them apart, reuse the parts, etc.
Some of the old ISA 14.4 or slower data/fax modems rivaled the size of an 8800 GTS.
 
Apparently the marketing gimmick worked, because now I have people telling me that I don't know what a GPU is and that nVidia created the first GPU, when it just stands for Graphics Processing Unit (which doesn't even imply 3D, let alone hardware T&L). 😉
 

Thanks! Typical case of saying one thing and thinking another.
Well, as I see it, I was wrong on both counts. To me a GPU needs to be programmable. But the term GPU includes the word processor, which suggests processing, and that doesn't require it to be programmable. Which brings me to where I was wrong: assuming that a processor has to be programmable to be a processor. It doesn't. I guess I was thinking of PC CPUs.
Basically I confused programmable chips and non-programmable chips. Are there terms to differentiate between the two?
I mean, as mpjesse pointed out, the GeForce 3 was programmable. That differentiates it from the rest of the GPU crowd, but it's still just a "GeForce 3" or an NV20. So while we're at it, what is a programmable chip called?
 
Well, apparently it's not an easy argument to lay to rest, seeing as how Wikipedia and Webopedia pretty much say two different things. 😀
 
Actually, Nvidia bought 3dfx - that's why they can use SLI as the name of their multi-GPU configuration (well, it's based on the 3dfx design anyway, which might explain why it works better than Xfire).

Btw, Nvidia did SLI on a single card... the 7950.

Actually, 3dfx's SLI isn't the exact same thing as nVidia's SLI. SLI for nVidia stands for Scalable Link Interface, while SLI from 3dfx stands for Scan-Line Interleave. I'm not sure if they worked the same way or not.

P.S. 7950 GX2 = failure
 
Technically speaking, SEALBoy is correct.

Prior to the introduction of the GeForce 256 and hardware T&L, video cards were referred to as graphics accelerators, nothing more. This was due in large part to the limited number of things they actually accelerated or offloaded from the CPU.

Hardware T&L allowed the graphics core to accelerate other things, such as OpenGL's lighting and matrix operations.

Definition of: T&L

(Transformation & Lighting) Moving 3D objects on screen and changing the corresponding lighting effects. Transforming these 3D matrices many times per second as the objects move and recomputing their shadows each time takes an enormous amount of processing. Hardware T&L offloads these functions from the system CPU into the display adapter or some other board, which enables a greater number of polygons to be processed to create a more realistic effect.
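In code terms, this is roughly the per-vertex math that hardware T&L takes off the CPU - a simplified, fixed-function-style sketch with made-up names, not anything out of an actual driver:

typedef struct { float x, y, z, w; } Vec4;
typedef struct { float m[4][4]; } Mat4;   /* row-major 4x4 transform */

/* The "T": multiply a vertex by a 4x4 matrix (e.g. model-view-projection). */
Vec4 transform(const Mat4 *mat, Vec4 v)
{
    Vec4 r;
    r.x = mat->m[0][0]*v.x + mat->m[0][1]*v.y + mat->m[0][2]*v.z + mat->m[0][3]*v.w;
    r.y = mat->m[1][0]*v.x + mat->m[1][1]*v.y + mat->m[1][2]*v.z + mat->m[1][3]*v.w;
    r.z = mat->m[2][0]*v.x + mat->m[2][1]*v.y + mat->m[2][2]*v.z + mat->m[2][3]*v.w;
    r.w = mat->m[3][0]*v.x + mat->m[3][1]*v.y + mat->m[3][2]*v.z + mat->m[3][3]*v.w;
    return r;
}

/* The "L": simple diffuse lighting, N dot L clamped to zero. */
float diffuse(Vec4 normal, Vec4 light_dir)
{
    float d = normal.x*light_dir.x + normal.y*light_dir.y + normal.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}

Before the GeForce 256, the CPU ground through that for every vertex of every frame; hardware T&L moved it onto the card.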


Future generations, such as the GeForce 3/GeForce 4, allowed fully programmable GPUs with hardware vertex shaders and fragment shaders. With the introduction of the shaders came Cg and corresponding shader languages, such as HLSL and GLSL. The rest is history.

As an afterthought, I currently have both a Diamond TNT2 Ultra and an Asus GeForce 256 DDR in storage. Oh the memories.
 
SLI for nVidia stands for Scalable Link Interface, while SLI from 3dfx stands for Scan-Line Interleave.

Nvidia SLI renders a frame with one of two methods: either
*the upper half of the screen is processed by one card, and the bottom half is processed by the second,

OR

*odd frames on one card, even frames on the other.


3dfx SLI:
*one card renders the odd-numbered scanlines, the other the even-numbered scanlines.

Amazing that I still remember this.
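Here's a rough C sketch of how the work gets split in each scheme - render_scanline, the resolution, and the function names are all made up just to show the division of labour, not how either driver actually works:

#include <stdio.h>

#define SCREEN_LINES 768   /* made-up vertical resolution */

/* Hypothetical stand-in for "card N draws scanline L of the frame". */
static void render_scanline(int card, int line)
{
    printf("card %d draws line %d\n", card, line);
}

/* 3dfx Scan-Line Interleave: alternate scanlines between the two cards. */
void frame_3dfx_sli(void)
{
    for (int line = 0; line < SCREEN_LINES; line++)
        render_scanline(line % 2, line);   /* even lines -> card 0, odd -> card 1 */
}

/* Nvidia split-frame mode: top half on card 0, bottom half on card 1. */
void frame_nvidia_split(void)
{
    for (int line = 0; line < SCREEN_LINES; line++)
        render_scanline(line < SCREEN_LINES / 2 ? 0 : 1, line);
}

/* Nvidia alternate-frame mode: whole frames alternate between cards. */
void frame_nvidia_alternate(int frame_number)
{
    for (int line = 0; line < SCREEN_LINES; line++)
        render_scanline(frame_number % 2, line);
}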

As an afterthought, I currently have both a Diamond TNT2 Ultra and an Asus GeForce 256 DDR in storage.

I still have a GeForce MX 200 32MB and a Savage 4 16MB. I remember playing Counter-Strike beta 6 on the Savage 4, and it was sweet.
 
I've decided I'm going to buy one card from each 3dfx generation off eBay. I'm already bidding on a Voodoo1 and a Voodoo2. Then I'll move on to the 3000 and 5000 series.

I still own two Voodoo1s, two Voodoo2 12MBs in SLI, and one or two Voodoo3s 😛

3Dfx seems like the only company where fanboys unite :lol: :lol: :lol:
 
My videocards to date:
1. Some prehistoric Trident ISA video card.
2. Creative Riva TNT 16MB PCI (yes, they did make graphics cards).
3. ELSA Riva TNT2 32MB AGP (anyone remember ELSA and their 3D-glasses?)
4. Visiontek GeForce2 MX400 64MB AGP
5. Sapphire Radeon 9000 64MB DDR AGP
6. MSI Radeon 9250 128MB DDR AGP
7. nVidia GeForce 6100 IGP 128MB shared (current :cry: :cry: :cry: )
8. Upgrade soon... I hope.
 
Haha! See what I started by saying GPU?

I really only typed it because I was too lazy to type Graphics Accelerator.

Hmm..
I think I started on a Trident 4x AGP card with 4MB memory.
Then the Diamond Riva TNT2 Ultra 32MB, then my 9800 SE 128MB... the 9800 is what I bought for my current rig; the previous two were in my Pentium II 300. I used a laptop for a long time with a GeForce2 Go.