The 30 Year History of AMD Graphics, In Pictures



The problem with 3dfx was that they were mismanaged and weren't really innovating anymore. IIRC, the original Voodoo was a 3D-only accelerator, so you needed a separate 2D card alongside it. That made a 3dfx setup especially expensive, while ATI and Nvidia had single cards that handled both 2D and 3D; they were slower overall, but they were cheaper and the performance gap wasn't that big. I'd say the primary IP Nvidia gained from the purchase was SLI, and these days Nvidia doesn't seem to care much about it.

As for the OEM card issues... OEMs were and are shady, which is why I will never own an OEM desktop ever again.
 


I don't think that happens much anymore, depending on the type of PC, especially at the higher end. If it's a gaming-marketed PC, it generally ships with reference GPUs straight from AMD or Nvidia. Dell put an actual Nvidia reference GTX 970 in the XPS 8900 when optioned with one.

A friend of mine was recently in a bind, rushed to Best Buy, and bought an HP Omen "gaming" PC; it came with an Nvidia GTX 1070 Founders Edition, the exact same model you can buy directly from Nvidia's website.

Now, if you want to talk about the crappy memory, power supplies, and motherboards OEM PC vendors use, that's another matter.
 

Karadjgne

Dell has a habit of using Delta PSUs in its higher-end PCs, mainly the Alienware line, which kinda undercuts the "crappy" theory. HP and Dell both use Asus, MSI, and Gigabyte as OEMs for their motherboards. It's generally not the hardware that's crappy, but what Dell, HP, etc. do to the hardware via BIOS or design restrictions.

My X800GT, for instance, was an actual reference X800GT, but as far as I can tell one of the memory ICs failed QC, so instead of being junked it was factory-chopped and flashed with Dell firmware. It went from a 192-bit memory bus to 128-bit, was missing a quarter of its VRAM, and you could physically see where the traces had been scratched out or cut. Common fixes even included using a pencil to draw the required traces back in, but it was anyone's guess whether that worked.

AMD has been doing similar things for years, even with its CPUs; some chips, like the X3, could be unlocked into an X4. I don't mind failed stuff being repurposed as lower-end parts, it makes perfect business sense, just tell me beforehand that it's been done.
 

Luis XFX

The ATI Rage 128 was my first video card. My favorite to date is my X850XT; it was relatively cheap and handled games through my entire time in college. I was really hoping the Vega series would be as good as or better than Nvidia's current lineup.
 

BorgOvermind

I've been using ATI cards since the days of 4MB PCI cards. I've had a Rage, a Radeon SDR, an X850XT, a 3870, and a 4870, and I still have a 5870 in my primary system. No need to upgrade atm.
 

pbug56

The All-In-Wonder 7500 was a perfect example of hype over substance. It was an OK display card for the time, but it was sold to people as a video-digitizing solution. I was a PC hardware novice, and nothing in the marketing said that the puny Pentium of the era, not the hardware on the card, had to do all the actual digitizing. Trying to digitize a 2-hour S-VHS tape would crash the PC within 24 hours without ever finishing. A real ripoff. And that's before all the trouble I had with TDR BSODs, a true ATI driver specialty.
 