New Voodoo3 3000 driver!

A V5 5500 from '00 on an '06 game 8O Unbelievable it can even produce a 25fps minimum.

Nice tribute here. You may just start dripping.
http://www.youtube.com/watch?v=DncCD6frhL8&mode=related&search=
 
Actually, all you needed to do was get the last update/patch for Unreal to get fog and bump mapping under Direct3D (the preceding beta patch would do the same for OpenGL, but that was stripped from the 'released' final patch).

On a TNT with the last patch - bump mapping, 32-bit colour/texture/Z-buffer, vertex shading and curved surfaces - Unreal was a thing of beauty on a GeForce... (it looked sleek and ran quite fast on my GeForce4).
 
For the year it was released, it was full of ground-breaking eye candy. To be honest, I never bought any later versions of Unreal. After the original, it became a mass-multiplayer monster, and I found little appealing about that.

I always preferred the Unreal engine and Infiltration wholeheartedly over the original Half-Life's Counter-Strike. Unreal's use of Glide and OpenGL had me like a deer staring into headlights.
 
When you had the retail version of Unreal, the patch (2.26, if I'm not mistaken) was free... and it unlocked the full eye candy in Direct3D.


Unreal was amazing for its time. It was such a great game, and the graphics were simply amazing. Groundbreaking even.

The last one I got was Unreal Tournament. It was awesome. Couldn't get into the newer ones, though. There was just something missing.
 
Do you think if 3DFX was still alive (?) today we would all have better cards 😱 ?

I'm not sure, because ATI and nVidia have really gone into overdrive these past few years in competition. 3Dfx also had... unconventional... cards. Like the Voodoo5: two (or four) chips on one card, lots of RAM, external power needed (on the 6000, I think), yet still no hardware T&L, and the GeForce 2 Ultra couldn't be beat. Similarly, their Voodoo 3 had only 16-bit colour support and wildly varying performance depending on the API.
 
"3d accelerator" sounded soooo much cooler.

All hail 3dfx.

Owner of a Voodoo 2 (12 mb ram....woot!) ....but seriously it was the shiznitz.
 
Here you go mitch. My favorite maps were outside snow worlds.
Unreal-GlideVoodoo1flyby.jpg


2.26 with the candy 8O. Pretty good for '98.
 
At the very least it ran on 2k - and fast. However it might be troublesome to run it on dual core or multi CPU machines, so don't forget to set affinity! The Unreal engine is known for its dislike of dual cores.
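Setting affinity by hand every launch gets old. Here's a minimal sketch of pinning a process to one core - note this assumes a Linux-style system, since `os.sched_setaffinity` is Linux-only; on Windows (where you'd actually be running Unreal) you'd use Task Manager's "Set affinity" or `start /affinity` instead:

```python
import os

def pin_to_core(pid, core):
    """Restrict `pid` (0 = the current process) to a single CPU core.

    Old engines like Unreal's can misbehave when the scheduler bounces
    them across cores; pinning to one core is the usual workaround.
    """
    os.sched_setaffinity(pid, {core})   # set the new CPU mask
    return os.sched_getaffinity(pid)    # read it back to confirm

if __name__ == "__main__":
    print(pin_to_core(0, 0))            # pin ourselves to core 0
```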

The part I liked most was the ending's 'turbo tunnel', where you went through the whole alien ship to the surface and ended up lost and adrift in space - best system speed test ever. I would use it as a benchmark (with an FPS counter) to tweak for the best cache efficiency, CPU capability, AGP speed and graphics card driver settings...
 
Comptia_Rep said:
NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO



I was really hoping this was a current article. I am very disappointed.


2 manufacturers that should never have died:
3DFX and Creative (Banshee line GFX)

Ummmm, 3Dfx made the Voodoo Banshee, and it wasn't that flash when it was out - more like 3Dfx's first 2D/3D solution.

3Dfx went down because of poor management, I believe, and was later bought up by Nvidia.

http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-3.htm

http://en.wikipedia.org/wiki/3dfx
 
I think some people are mixing up GPU and API.

You had Nvidia:
worked with DirectX 3.0 and OpenGL
T&L was a part of DirectX and OpenGL

You had 3DFX:
their API, also called 3DFX, was superior to DirectX
a 3DFX card worked with DirectX, but slowly - with a 3DFX application (like Tomb Raider 1) it was a rocket
3DFX had their own API, so no T&L, but the chip on a 3DFX card was a GPU
their own API, and the killing mood of Microsoft (who were pushing their DirectX), ended the 3DFX API, cards & company.

Correct me, but don't flame me if I'm wrong.
 
i think some people are messing up GPU and API
[...] their API, also called 3DFX was superiour to directX [...]

3Dfx had Glide, their equivalent to D3D and OpenGL. It ran faster and looked better on 3Dfx cards, and only 3Dfx cards could do it.
 
3dfx was simply amazing.

The Glide API could produce amazing colours (in 16-bit, versus Direct3D's 32-bit)!!!
I still remember Unreal 2.26 in Glide mode... It was way ahead of its time...

When 3dfx was bought by NVIDIA I was truly disappointed.

I was a big fan and I was the owner of:

1 x PCI 3dfx Banshee 16MB SDRAM
3 x AGP Voodoo 3 3000 16MB SGRAM
1 x AGP Voodoo 5 5500 64MB SDRAM

I still use the Voodoo 5 5500 in an old system I have, and when I plugged it into my ASRock 775 board (running an E6400) I was amazed - many recent games were running amazingly fast for a year-2000 graphics card.

Respect to 3DFX!!!!

I only wish the founders and the engineers could build a new company under a new name and produce products like they used to. Who knows... maybe they will invent something better than Glide!

:roll:
 
I was saddened when 3DFX was bought out. I still had a poster somewhere promoting the Voodoo 3.

My vid card list went like this:
Voodoo 3 3000 16MB AGP
Geforce 4 MX4000 64MB AGP
Geforce 5 FX 5600 128 MB AGP
Geforce 6 6200 OC 256MB AGP
Geforce 6 6600 GT 128MB AGP
ATI X1600 PRO 512MB AGP
Geforce 7 7950 GT 512MB PCI-E


I had a Pentium 3 500 with 512K L2 cache back in '99 and Unreal Tournament was running great on it.
 
I was saddened when 3DFX was bought out. [...]

I had a Pentium 3 500 with 512K L2 cache back in '99 and Unreal Tournament was running great on it.

Ah, the Katmai - a P2 with SSE instructions justified the extra 'i' to make 3 :lol: The P3 was the start of something considered 'decent' in hardware.

I started with:

s3 trio64
geforce2 mx200 32mb pci
s3 savage 4/pro + voodoo2 1000 12mb pci
geforce 4 ti4200 128mb agp + geforce2 mx200 32mb pci
geforce2 mx400 64mb
geforce fx5600 128mb agp
geforce 6600gt 128mb agp
(back to my fx5600, my prolink 6600gt died)
geforce 6600gt 128mb agp (another one)
geforce 7900gt
 
i think some people are messing up GPU and API
[...] 3DFX GPU had their own API, so no T&L but the chip on a 3DFX card was a GPU. [...]

Well, a chip is considered a GPU (Graphics Processing Unit) when it can perform transformations following instructions, rather than simply accelerating the drawing of some predefined elements. As such, T&L (Transform & Lighting) was the first form of operation performed by the graphics card itself rather than by the CPU (the card became able to move or create polygons/light sources directly, instead of relying on CPU instructions to do so).

Early 3DFX cards were graphics accelerators: the geometry was computed by the CPU, coordinates would be sent to the card along with the textures, and the card would 'dumbly' draw all the polygons following the instructions.

T&L allowed the card to create an object and transform it directly in its framebuffer memory, completely bypassing the CPU. This required a somewhat more sophisticated memory controller, since it was no longer a simple Z-buffer and texture cache. Object conceptualization, along with object manipulation capabilities (or commands) and memory control, made the card 'intelligent' and thus qualified all cards able to do T&L as 'GPUs'.
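The 'transform' half of T&L is, at bottom, 4x4 matrix math applied per vertex - the same loop the CPU used to run before the hardware took it over. A minimal illustrative sketch (plain Python, no real graphics API involved):

```python
# Sketch of the "transform" in T&L: multiply each vertex by a 4x4
# matrix (here, a translation). Pre-T&L cards left this loop to the
# CPU; a T&L GPU runs it in hardware for every vertex, every frame.

def translate(tx, ty, tz):
    """Build a 4x4 translation matrix (row-major)."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

def transform(matrix, vertex):
    """Apply a 4x4 matrix to an (x, y, z) vertex in homogeneous coords."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    return tuple(sum(matrix[row][col] * v[col] for col in range(4))
                 for row in range(3))

# Move a triangle 5 units along x: one matrix, applied per vertex.
triangle = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
moved = [transform(translate(5, 0, 0), v) for v in triangle]
print(moved)  # [(5.0, 0.0, 0.0), (6.0, 0.0, 0.0), (5.0, 1.0, 0.0)]
```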

Glide was a proprietary API made by 3DFX, initially inspired by SGI's OpenGL (a simple OpenGL driver was available for 3DFX cards too, and it worked wonders for full-screen operations).
Direct3D was at first merely a half-hearted attempt by MS to create their own 3D API, and it stayed in the background for a long time due to several issues:
- less flexible than OpenGL
- much slower than Glide
- an API that changed all the time.

At first, you had many competing 3D APIs in the Windows world:
- OpenGL, for which Nvidia had a license: third-party programmers also created 'generic' drivers for other cards
- PowerVR, which was very different from the others, as its rendering was tile-based (abandoned due to its very divergent, impractical rendering method)
- Glide, which ran on 3DFX hardware only, but it was open enough that several Glide>OpenGL wrappers appeared and gave good results
- Direct3D, which... errr... sucked.

Now remain:
- Direct3D, which is slowly copying the methods and flexibility first created for OpenGL
- OpenGL, which is not merely a 3D API, but a very broad graphics conceptualisation language.

Most cards now support both, but with varying success:
- Nvidia cards are OpenGL champions, and are good at Direct3D stuff
- Ati cards scream in Direct3D and used to suck in OpenGL (they got better)
- Intel does Direct3D and OpenGL hardware, but it sucks anyway performance-wise
- Sis does Direct3D only (but their IGPs are even worse than Intel's)
- Matrox is in its own little world (although they seem to favour OpenGL)

When you do 3D, you usually do the following:
1- use OpenGL
- to do extremely precise rendering,
- to run on several systems - even handhelds (Win32/Linux/BSD/MacOSX/Solaris...),
- to use some very recent hardware features (OpenGL being modular, it allows new features to be used in a snap while still using the same language)
- to do anything that requires both 2D and 3D elements at the same time

2- use Direct3d
- to run software on Windows alone
- to get hardware capabilities and rendering resolutions
- to... errr... ensure your software won't work on newer OSes?

Some 3D engines actually allow the use of both languages, a famous (Free) one being OGRE.
Sometimes, to get the best performance, a game will require the latest version of DirectX/Direct3D to enumerate device capabilities automatically, but will then do the rendering in OpenGL (id Software titles, for example) - which allows the software to run on almost any other platform, at the price of some user input to compensate for the lack of device capability enumeration...
 
Just wanted to confirm... Unreal runs great on Win XP. Same with Unreal Tournament. It also runs on Vista x64 8O

Use a Glide wrapper for best image quality, but it's slower than Direct3D.
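For the curious, a wrapper works by exporting the entry points a Glide game expects and re-implementing each one as calls to another API. A toy sketch - the names echo Glide/OpenGL style, but this is illustrative and not the real API surface:

```python
# Sketch of a Glide->OpenGL wrapper: the game calls what it thinks is
# a Glide entry point; the wrapper re-emits the work as OpenGL-style
# calls. Here a list stands in for the real OpenGL driver.

gl_log = []  # pretend OpenGL: record what the wrapper emits

def gl_call(name, *args):
    gl_log.append((name, args))

def grDrawTriangle(a, b, c):
    """Wrapper's replacement for a Glide-style triangle call."""
    gl_call("glBegin", "GL_TRIANGLES")
    for v in (a, b, c):
        gl_call("glVertex3f", *v)   # one emitted call per vertex
    gl_call("glEnd")

grDrawTriangle((0, 0, 0), (1, 0, 0), (0, 1, 0))
print(len(gl_log))  # 5 calls: begin + 3 vertices + end
```

The translation layer is exactly why the post above says a wrapper trades speed for image quality: every Glide call fans out into extra work on the other side.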
 
i think some people are messing up GPU and API
[...] correct me, but don't flame me if i'm wrong

(As some have mentioned)

An API is an Application Programming Interface. It's the means by which engineers send commands to the GFX card. As long as the card supports the interface, then any software designed for that interface "should" work.

3DFX never had a 3D GPU: they used RISC processors. nVidia was the first to market with a GPU, and 3DFX tanked before they could return fire.

DirectX is a game framework specific to MS windows. OpenGL is an open API. D3D is a proprietary API that is a component of DirectX. Glide was a proprietary API that was only available on 3DFX hardware.

While Glide was good, it was really tailored specifically for 3DFX hardware, which is why it performed so well. It was tailored specifically to the inner workings of the 3D card. D3D and OpenGL are more abstract interface implementations that try and cover the broad range of capabilities, and that comes at a slight performance cost.

So in short, Glide made games soar - not because it was a great API, but because it forced the developer to interact with the card in the most optimal way. If anything, it was good that it died. In fact, as good as D3D is, it should also die (or be merged with an open standard). There's no benefit to having both OpenGL and D3D.
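That thin-versus-abstract point can be sketched in miniature - all names here are illustrative stand-ins, not real Glide or D3D calls:

```python
# Why a hardware-specific API (Glide) could beat an abstract one
# (D3D/OpenGL): the thin API maps straight onto one card's native
# command format, while the abstract API must validate and translate
# through a driver layer for whichever backend happens to be present.

class VoodooCard:
    """Stand-in for the hardware: accepts triangles in native format."""
    def __init__(self):
        self.commands = []
    def submit(self, cmd):
        self.commands.append(cmd)

def glide_draw_triangle(card, verts):
    # Thin API: one call, straight to the card's native command.
    card.submit(("TRI", verts))

def d3d_draw_primitive(card, kind, verts):
    # Abstract API: validate, normalise, then translate per backend.
    if kind != "TRIANGLELIST":
        raise ValueError("unsupported primitive")
    for i in range(0, len(verts), 3):
        card.submit(("TRI", tuple(verts[i:i+3])))  # driver translation

card = VoodooCard()
glide_draw_triangle(card, ((0, 0), (1, 0), (0, 1)))
d3d_draw_primitive(card, "TRIANGLELIST", [(0, 0), (1, 0), (0, 1)])
print(len(card.commands))  # 2: both paths reach the card, one directly
```

Both routes end in the same hardware command; the abstract one just does more bookkeeping on the way there - negligible here, but real on a 1998 CPU pushing thousands of triangles per frame.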

Keep in mind that most of the brilliant minds that were working for 3DFX are now working for nVidia. If you're wondering where 3DFX would be right now, nVidia is a good place to start looking.