The five worst Nvidia GPUs of all time: Infamous and envious graphics cards

What I recall reading suggested it was less of an issue than it was often portrayed. Games had to use almost the GPU's entire memory capacity before they'd hit the slow memory bank.
I used one for around 5 years and never ran into an issue. While it was flagrant deception, I'd argue it's nowhere near as bad as the two 3060s or the two 1060s. The card was a fantastic price-versus-performance buy, and Nvidia should have just sold it as a 3.5GB part.
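For anyone who wants to picture that, here is a toy Python sketch (sizes in MB) of the segmented arrangement being described: buffers land in the full-speed 3.5 GB partition first and only spill into the slow 0.5 GB segment once the fast one is nearly full. The ToyVram class, the buffer sizes, and the allocation policy are invented purely for illustration; this is not how the actual driver manages the card's memory.

FAST_CAPACITY_MB = 3584   # 3.5 GB full-speed partition
SLOW_CAPACITY_MB = 512    # 0.5 GB slow segment

class ToyVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def allocate(self, size_mb):
        # Place a buffer in the fast segment if it fits, otherwise spill to the slow one.
        if self.fast_used + size_mb <= FAST_CAPACITY_MB:
            self.fast_used += size_mb
            return "fast"
        if self.slow_used + size_mb <= SLOW_CAPACITY_MB:
            self.slow_used += size_mb
            return "slow"
        return "out of memory"

vram = ToyVram()
# A game only touches the slow segment once it has already filled ~3.5 GB.
placements = [vram.allocate(512) for _ in range(8)]
print(placements)  # seven buffers land in "fast", the eighth spills to "slow"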
 
Reactions: jlake3 and bit_user
The GeForce FX 5800 was nicknamed the "dustbuster" because it was one of the first dual-slot GPUs, needed a godawful amount of air to stay cool (and the blower-type cooler was awful-looking and very noisy), and performed quite badly everywhere. The µarch was a DX8 chip with some extensions to support DX9 features, but it sucked at them: people who had GeForce 4 (non-MX) cards could get better performance for lower prices, and ATI's R300 chips were silent, cool, and deadly. Nvidia made things "right" with the GeForce 6xxx, but at the time most people didn't really consider the 6800; the 6600 GT was the one to get, and the non-TurboCache 6200 was a game of chance: you might be able to flash a 6200 to 6600 levels of performance, and in some extremely rare cases overclock it to 6600 GT clock speeds.
Nvidia's start with shady practices was the GeForce 2 "Ti" variants: they claimed a 50% performance increase and bumped up their cards' prices accordingly, when most of the performance came from their newer "Detonator" drivers; people with existing GeForce 2 cards who installed those drivers got almost the same performance. For that reason, I don't consider the RTX 3xxx and RTX 4xxx "the worst": it's merely the norm coming from Nvidia.
The GeForce 4 "MX" generation was also massively misleading: those were GeForce 2 (DX7)-level cards with a revamped VRAM controller, whereas the "regular" GeForce 4 had that same memory controller plus DX8 support.
And one that was mentioned in the comments but was horrendous enough to deserve one of the top spots: the failing solder on mobile GeForce 8600 chips.
 
What a strange article.

So, the 3080 was bad because... people couldn't buy it???
Well, first off: perception isn't very objective.

But then there was always the impression that Nvidia didn't do enough to protect gaming GPUs from being abused for mining.

Of course, I doubt there was much that Nvidia or OEMs could really have done to avoid that, and that type of "gamer altruism" isn't really what shareholders would appreciate.

Perhaps one of the greater values of this article is that it covers the ex-post perception of the various Nvidia generations, which can evidently turn out quite differently from the technical merits at the time.

Otherwise it would really just be a rehash.

Often, cards became "bad" just because the competition managed to jump ahead.

Here, though, outside factors completely stole the show, and so the 30 series will be remembered as the card everybody wanted and very few managed to get.
 
My worst Nvidia was a 780 from MSI, a huge and impressive triple-fan, dual-slot card that still looks like a beast.

I got it together with an R9 290X in a "consulting work for hardware" deal and put the 780 on an Intel Penryn quad core, the 290 on a Phenom II x6 at the time. I was ready to have both teams fight it out under my desktop and turn my heating to electric.

Unfortunately, both turned out to be very little fun. The 780 would just crash all the time, but I never really got the time to figure out what was wrong with it, and I didn't even suspect the GPU as the culprit; I thought it was a power issue more than anything.

I replaced it with a GTX 980 Ti after a year or three and tried passing it on to one of my sons in an entirely different system... where it crashed just the same, finally turning my suspicion onto the GPU.

By then, the Internet had revealed that this entire SKU was quite simply factory-overclocked a tad too high.

Using Afterburner to lower the factory overclock ever so slightly made it 100% reliable... years later.

Meanwhile, the 290 had a little switch on it, which might have said "turbo" somewhere, so I flipped it before installation... and then promptly forgot about it.

I was far too busy to do a lot of gaming at the time, but when I did, it just turned into a jet engine: the noise was intolerable, but the quiet 780 next to it kept crashing...

Years later again, that card went into a system built from cast-offs for my father-in-law, where I wondered what that switch was for and toggled it to the other position...

Lo and behold, all of a sudden that card was as quiet as a cat even while gaming! And not noticeably slower, to boot!

As a four-decade veteran of home-built PCs, I felt rather humbled that I never managed to get proper value from that pair, even though I had everything at hand to make that happen!
 
"Dishonorable Mention"

For me that's the best part of the article!

And yes, I hated all those proprietary features, to the point where I wouldn't even have bought Nvidia if my GPUs weren't paid for by ML work, where VRAM size and CUDA capabilities still rule supreme.

For me, DLSS finally turned out to be quite a game changer: it quite simply makes the difference between playable frame rates at 4K, even on a 4090, and having to choose between a lower resolution (no fun at 42") and mushy frame rates (no fun at any resolution).

It has also revived some of my older cards, like a 2080 Ti or a 3070, on the smaller screens my kids use.
 
I thought the GT 1030 DDR4 would be on that list...
I still recall some YouTube comments; it actually fooled some people, as it was sold at a similar price to the better GDDR5 version of the 1030.
 
Reactions: bit_user
Meanwhile, VESA's Adaptive Sync and AMD's FreeSync (based on Adaptive Sync) did the same thing and didn't really cost anything extra,
'Meanwhile' meaning 'a year later, and then another year after that to solve the ghosting issues from static pixel overdrive that came from abusing stock eDP adaptive sync (pixel overdrive needs to vary as the refresh rate varies to avoid undershoot and overshoot; G-Sync modules used per-refresh-rate LUTs, while FreeSync used a single value for all refresh intervals)'.
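To make the per-rate-LUT point concrete, here is a rough Python sketch of the two approaches: interpolating an overdrive strength from a small table keyed by refresh rate versus applying one fixed value everywhere. The table entries and strength numbers are invented for illustration and are not taken from any real module or scaler firmware.

OVERDRIVE_LUT = [(48, 0.20), (75, 0.35), (120, 0.55), (144, 0.65)]  # (Hz, strength)
FIXED_OVERDRIVE = 0.55  # "one value for all refresh intervals"

def overdrive_from_lut(refresh_hz):
    # Linearly interpolate an overdrive strength for the current refresh rate.
    lut = OVERDRIVE_LUT
    if refresh_hz <= lut[0][0]:
        return lut[0][1]
    for (hz0, s0), (hz1, s1) in zip(lut, lut[1:]):
        if refresh_hz <= hz1:
            t = (refresh_hz - hz0) / (hz1 - hz0)
            return s0 + t * (s1 - s0)
    return lut[-1][1]

# With VRR the refresh rate changes frame to frame; a per-rate lookup tracks it,
# while the fixed value over- or undershoots at the extremes of the range.
for hz in (48, 90, 144):
    print(hz, round(overdrive_from_lut(hz), 2), FIXED_OVERDRIVE)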
Then you have 'FreeSync Premium Pro', AKA 'you get to use HDR and FreeSync at the same time', which requires a proprietary implementation in games or does not work at all. It, too, arrived some time after G-Sync's HDR implementation, which ironically does not require games to implement a proprietary API.

Look, we get that sometimes a company wants to move the market forward with new technologies, but these should be the same technologies that the game developers and graphics professionals want, not just arbitrary vendor lock-in solutions
LOL. DLSS, RT, and G-Sync stick around to this day because they are 'technologies that the game developers and graphics professionals want' (and that consumers want). Try to push a proprietary feature that is not desired, and it will flop, and flop hard (e.g. Mantle).
It's pretty much the Graphics Technology Development Cycle at this point:
1) Nvidia introduces some new technique
2) It's decried as useless and proprietary and nobody actually wants it anyway and it'll just fail and Nvidia smells
3) New technique is implemented in games
4) New technique is widely popular with developers and consumers (but is somehow still simultaneously unwanted and worthless)
5) Other vendors implement new technique once it has widespread adoption
6) "Of course [New Technique] is the future, everyone always wanted it!"

Variable refresh rates, GPGPU, GPU accelerated raytracing, AI-accelerated upscaling, frame generation, etc. The cycle has played out time after time, and will continue to do so.
 
Reactions: jbo5112
4090. Yes, it is a technological marvel, but I wouldn't get one.
Fire hazard: I don't think anyone who has this card leaves the man cave without turning the PC off or putting it to sleep, that is, unless they want to gamble with their house.
I believe they are going to replace the power connector with another design, or revert to the previous one, too.
 
Reactions: Amdlova
This list completely ignores severe shortcomings in a variety of models in order to replace them with "this card sucks because you couldn't get one"?

Who writes this drivel? Seriously?

Even in semi-recent history, the 900 series, with either too little VRAM or more than the card could actually use? Cool and Quiet?

C'mon guys. I would assume some of the writers went to school for this. Tap into that higher learning. Try again. LAME!
 
The Riva TNT2, GeForce 4 MX, and the whole GeForce FX line were bad cards. The GeForce 4 Ti line was not bad, but at that time ATI had a monster called the 9700 Pro.

The best GPU in terms of performance per dollar was the GeForce 6600 GT. An absolute beast, it could run Doom 3 on Very High settings at 60+ FPS for as little as 200 bucks.

And we also had the GeForce 6800 LE (?), which allowed you to unlock pixel pipelines via software and transform it into a 6800 GT. What a time to be alive.
 
"The [Nvidia] NV1 was the most stupid, wrong-headed thing anybody could have possibly built at that time. It was so bad it would have poisoned the industry if it had become a predominant thing– it was horrible." - John Carmack on Nvidia's NV1 card

The "Nvidia NV1 / STG2000" was so bad I'm surprised the company survived, and it should have easily claimed the first spot on Nvidia's all-time-worst list. It ran on quadrilaterals instead of triangles and bundled a lot of mediocre features, driving the price up significantly. Texture mapping on quadrilaterals was such a mess that Nvidia had to offer to help developers.

A year later, Microsoft released Direct3D, which uses triangles, and effectively killed the product line and its sequel. It forced a messy (and quite buggy) software translation layer to convert triangles into quads (i.e., by adding a zero-length side), which usually (always?) ended up with horribly warped textures. The only fix was for developers to provide custom textures to support a poorly performing card. Wikipedia lists only 8 supported games for the card.
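To illustrate the 'zero-length side' trick being described, here is a tiny Python sketch that turns a triangle into a degenerate quad by repeating a vertex. The vertex format and function name are made up for illustration; this is not NV1's or Direct3D's actual API, just a way to picture why quadrilateral texture mapping could go wrong on such inputs.

def triangle_to_degenerate_quad(tri):
    # tri: three (x, y, u, v) vertices; return four vertices where the last
    # edge (c -> c) has zero length, so the "quad" is really a triangle.
    a, b, c = tri
    return [a, b, c, c]

quad = triangle_to_degenerate_quad([(0, 0, 0.0, 0.0),
                                    (1, 0, 1.0, 0.0),
                                    (0, 1, 0.0, 1.0)])
print(quad)  # the third vertex appears twice, collapsing one side of the quad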

The NV1 deserves to be on the list way more than the 3080, whose biggest problem was that it was popular enough that people would spend $2,500 to get one.
 
The [Nvidia] NV1 ...

It ran on quadrilaterals instead of triangles, ...
Well spotted.

...however, what it actually rendered were quadratic patches! This is an important distinction, since it goes some ways towards explaining why they did it. Like with modern tessellation, you can use it to reduce model geometry, substantially.

The insane part was believing they could steer the entire industry away from using triangles, not that quadratic patches were such an inherently flawed idea. Yes, if your models and code aren't designed to use quadratic surfaces, then the temptation exists to treat them simply as flat rectangles - or even triangles. However, that's very wasteful of the hardware's power, and really not how it was meant to be used.
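For anyone curious what quadratic patches buy you, here is a small Python sketch that evaluates a biquadratic Bezier patch from a 3x3 grid of control points: nine points describe a smoothly curved surface that would otherwise take a pile of triangles to approximate. This is the textbook formulation, shown only to illustrate the idea; I'm not claiming it matches NV1's exact surface math.

def bernstein2(t):
    # Quadratic Bernstein basis weights for t in [0, 1].
    return ((1 - t) ** 2, 2 * t * (1 - t), t ** 2)

def eval_patch(ctrl, u, v):
    # ctrl is a 3x3 grid of (x, y, z) control points; returns the surface point at (u, v).
    bu, bv = bernstein2(u), bernstein2(v)
    x = y = z = 0.0
    for i in range(3):
        for j in range(3):
            w = bu[i] * bv[j]
            px, py, pz = ctrl[i][j]
            x += w * px
            y += w * py
            z += w * pz
    return (x, y, z)

# A flat 3x3 grid with the center point lifted: nine control points, one smooth bump.
ctrl = [[(i, j, 1.0 if (i, j) == (1, 1) else 0.0) for j in range(3)] for i in range(3)]
samples = [[eval_patch(ctrl, u / 4, v / 4) for v in range(5)] for u in range(5)]
print(samples[2][2])  # the center of the patch bulges to z = 0.25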
 
"The [Nvidia] NV1 was the most stupid, wrong-headed thing anybody could have possibly built at that time. It was so bad it would have poisoned the industry if it had become a predominant thing– it was horrible." - John Carmack on Nvidia's NV1 card

The "Nvidia NV1 / STG2000" was so bad, I'm surprised the company survived, ...

The NV1 deserves to be on the list way more than the 3080, ...
Yes it was, but how big was the impact? I mean, Nvidia's market share was so small and its brand reputation was almost non-existent.

The situation was very different for the GeForce FX series. We even had that <Mod Edit> branding, "The way it's meant to be played", hyping the whole FX series even before it came out.
 
Last edited by a moderator:
Reactions: bit_user
Yes it was, but how big was the impact? ...
The impact on Nvidia was:
  1. laying off 70% of their company
  2. ditching their upcoming product
  3. having to deliver a new chip from scratch with only 9 months of funding left
Nine months is impossible for a tape-out. Instead, they have to spend a third of the cash on chip emulation software from an unproven startup that only ever has Nvidia as a customer. That purchase means delivering a product in only six months without ever making a physical prototype; the first physical hardware would be a batch of 100,000 chips. They also have 89 competitors with head starts, and they have to beat them by enough to turn a profit.

If they fail, Nvidia, the largest chip company in the world, is barely a blip in the history books.
 
Last edited by a moderator:
Reactions: bit_user
The FX series was hot garbage in anything DirectX 9. I posted something like this in another one of these threads, but all it takes is a picture (worth a thousand words). Keep in mind the Radeon 9000 series came out BEFORE the FX series.

[attached benchmark chart: 5517.png]
Oh boy, was it hot garbage in DX9. I "upgraded" from a GeForce 2 to a 5600 XT (three whole generations!) and performance went down in Homeworld 2. Why? Because the game automatically switched to a DX9 renderer! It wasn't just slower, either; it was pretty much unplayable.

Not helped by the scam that the 5600 XT was actually the slowest card in the lineup, below even the 5200! At least it was cheap...

I had to put up with that POS until the 6000 series came out. Now that was a proper DX9 GPU.
 
I'm going to stake out a rather controversial position and claim the only real problem with the RTX 4060 Ti is its price. If the same exact card had been sold as the RTX 4050 Ti and at $100 cheaper, I think it'd be quite popular. The bus width and memory capacity would also make much more sense, since the x50 tier is usually 128-bit and the RTX 3050 also had 8 GB.

Also, I appreciated the dishonorable mention of their proclivity towards proprietary technologies. Good call.
Isn't it true they shifted everything to higher numbering conventions with the 4000 series? The 4070 equivalent to a 3060, etc.?
 
I've never seen any confirmation of that (leaks or otherwise), but it sure looks that way to me.

Also not the first time, but it has bounced around a little.

80-class GPUs used to be consistently big silicon, but the 600 series was capped by a 104 chip; the 780/780 Ti were a return to big silicon, yet the 900 series' initial launch was again capped with a 104-class GPU. The GTX 980 was a 204 chip, with the 980 Ti released a year later on a 200 chip. The 10-series launch was a 104 chip again, with the 1080 Ti being a 102 chip launched a year later, and then the 2080 was once again a 104. But the 3080 saw the return of the 102 chip for the 80 class, which is why it is a little annoying that it makes this list for unavailability, when it was actually a decently priced unit at launch.

Now the 4080 (103) is a chip class in between the 104 and 102 dies, but it's priced like a 102 die, and the 102 die is priced even higher, which effectively moves the whole performance stack down while raising the pricing tiers. So it's almost two steps backwards rather than one.

That doesn't mean the trend will continue, since nothing says Nvidia won't launch a 102 GPU in the 50 series as an 80-class card. It all depends on the competition.
 
Reactions: Jagar123
I'd put the GTX 260 failure in place of the RTX 3080, as availability wasn't really Nvidia's fault; pandemic shortages and the ETH crypto-mining boom were primarily responsible. The GTX 260 was a bigger flop at its launch price, so much so that its MSRP had to be slashed from $399 down to $299 just weeks after release because the $299 Radeon HD 4870 had similar performance. Nvidia even had to refresh the GTX 260 with a 216-core version at $279 just three months later to stay competitive with the Radeon HD 4870.
 
Reactions: bit_user