The five best Nvidia GPUs of all time: Looking back at over 20 years of Nvidia history

Status
Not open for further replies.
Still love my GTX 1060 (6GB), and I plan to take it apart and display it in a frame in the future. Nvidia made the perfect card for the masses with that GPU. (Is it the most popular GPU ever sold? As far as I know it sat at the top of the Steam survey for years.)


And the GTX 1080 Ti was Nvidia's mistake.
That GPU was so good it single-handedly negated the 20 series... which is something they will never let happen again.
 
Still, the RTX 3060 stands as one of Nvidia's best midrange GPUs ever.

In what way is the 3060 "midrange"? It's a 1920x1080 75fps card, which is like Entry Level+ (entry level being 1920x1080 60fps). It may match the performance of the previous generation's midrange card, the 2070 Super, but it's still the entry level gaming card of its generation.

 
In what way is the 3060 "midrange"? It's a 1920x1080 75fps card, which is like Entry Level+ (entry level being 1920x1080 60fps). It may match the performance of the previous generation's midrange card, the 2070 Super, but it's still the entry level gaming card of its generation.

No, what you're describing is the RTX 3050, which cut the price by $80 (a 24% drop) and reduced memory capacity and bandwidth by 33%, which also resulted in 28–30 percent lower performance.

I stand by the statement that the RTX 3060 was a great midrange/mainstream card for its time. It was (in theory) $80 less than the 2060 Super and $180 less than the 2070 Super, with more VRAM that made it more balanced than the previous gen 2060.

Outside of the 3050, none of the RTX cards have actually been "entry level" or "budget" based on pricing. They start at "mainstream" and go up from there.
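Those percentages are easy to sanity-check. A minimal sketch, assuming the widely reported launch MSRPs of $329 (RTX 3060) and $249 (RTX 3050) and their 12GB vs. 8GB VRAM capacities:

```python
# Sanity check of the quoted figures. The MSRPs and VRAM sizes below
# are the commonly cited launch specs, not taken from this thread.
msrp_3060, msrp_3050 = 329, 249
vram_3060, vram_3050 = 12, 8

price_cut = msrp_3060 - msrp_3050                                # $80
price_drop_pct = round(100 * price_cut / msrp_3060)              # ~24%
vram_cut_pct = round(100 * (vram_3060 - vram_3050) / vram_3060)  # ~33%

print(price_cut, price_drop_pct, vram_cut_pct)  # 80 24 33
```

Both quoted numbers line up with those launch specs (the 33% bandwidth cut similarly follows from the 192-bit vs. 128-bit bus).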
 
The original 8800s should definitely be number 1. They were so groundbreaking compared to anything that came before. They made games look amazing at the time, and were almost a requirement for some titles. I remember that here in Canada they were hard to find, and they were some of the first cards I remember being scalped.
 
No, what you're describing is the RTX 3050, which cut the price by $80 (a 24% drop) and reduced memory capacity and bandwidth by 33%, which also resulted in 28–30 percent lower performance.

I stand by the statement that the RTX 3060 was a great midrange/mainstream card for its time. It was (in theory) $80 less than the 2060 Super and $180 less than the 2070 Super, with more VRAM that made it more balanced than the previous gen 2060.

Outside of the 3050, none of the RTX cards have actually been "entry level" or "budget" based on pricing. They start at "mainstream" and go up from there.

Yes, the 3060 could be called "mainstream", a term you used repeatedly when reviewing the 3060:

The Nvidia RTX 3060 brings a new level of performance to the mainstream market--sort of

Not once did you say "mid-range". To the contrary, Tom's Guide's senior editor Marshall Honorof did describe the 3070 (quite correctly) as a mid-range card:

While there’s something to be said for the RTX 3060 Ti ($400) as well, the RTX 3070 is, in theory, the perfect GPU for mid-range builds, attached to mid-range monitors.

So I stand by my statement that the 3060 is -not- a mid-range card; it's an entry-level gaming card, which is synonymous with "mainstream" in that it delivers the basics of what a "gamer" would want in the 2020s (1920x1080, full details, 60fps) from a video card, whereas the 3050, not being a "gamer" card, is incapable of that.


 
I've got to echo the others who cited the 8800 GT for bringing an unheard-of level of performance at a reasonable price, and the renamed version was even cheaper. At the very least it deserves a mention in the 8800 GTX entry.

The GeForce 4 series also deserves a special note in that, while it brought pretty good performance (I think it was the first consumer card with 128MB of RAM!), it was also Nvidia's first foray into consoles.

The TNT line was also pretty transformative, given that it combined 2D/3D and was actually good enough to compete with 3dfx, and it was followed up by the greatest card of the era, the TNT2 Ultra.

Overall a solid list, but the RTX 3060 just seems out of place given the relative impact of the rest, as it was never a great value (it was okay) and came on the heels of overpriced cards in the first place.
 
Nvidia is a mixed bag... The best GPU ever for us poors is the 1060 6GB, which even today still sells well on the used market.
As a kid, the only GeForces I knew were the MX 200, MX 400, and some FX 5200...
But in my opinion the sweet spot, and where the gaming momentum was, ran from the 7300 GT to the 7600 GT.
 
The 970 should be there instead of the 980. A far more important card, and one that was up there with the 8800 as maybe the best Nvidia GPU ever made.
 
Looks like I'm not the only one questioning the 8800 GTX. The 8800 GT held the price/performance crown for some time.

Also, regarding the 3060: beyond the pricing and availability issues, it was a great card that certainly gave AMD a run for its money. So, despite not being able to get one, and it being overpriced when available, it's the 5th best card from Nvidia in the past 20 years? Sure... I would have thrown in a GeForce 256 or GeForce 2, since they pretty much delivered a 1-2 KO to 3dfx.
 
The GTX 680 definitely deserves to be on this list. I just helped someone finish a new build who had been using that card for a decade. They couldn't believe how long it lasted. FWIW, they chose a 7900 XTX to replace it. Hopefully that card lasts 10 years as well.
 
IMHO: GeForce 256, GeForce 6800 series (no LE though), GeForce 8800 GTX/GTS (not Ultra), 750 Ti, and 1060 6GB. The high-end cards worth mentioning after G80 were the 1080 Ti and 4090.
As for the other post about the top 5 best ATi/AMD cards, I agree with the 9700 Pro, 7970, and RX 480 (or 470/580/590). It was very hard to nominate the other 2 cards; probably the HD 4870/4850 and Radeon 9550? I remember the 4850 forced the 9800 GT to discount by about $150.
 
I can't believe you didn't mention cards based on G92, because it was a truly legendary chip.
While the 8800 GTX was cool but expensive, the 8800 GT appeared at quite an affordable price with very close performance. Soon the 8800 GTS 512 was added with the same chip under the hood and even more speed, which made the 8800 GTX obsolete.

Later the 8800 GTS 512 was rebranded as the 9800 GTX, and I guess that was supposed to be its place from the very beginning, but for some marketing reason Nvidia pushed it into the previous generation with a weird positioning. So it's a unique case of a card being rebranded to a higher position in the lineup.

And that's not the end: the chip was moved to the newer 55nm process as the 9800 GTX+, and it finally found a place in the next generation as a low-to-mid-level card, the GTS 250. What a life story, huh?
 
We did discuss this and ultimately felt that the GeForce SDR/DDR wasn't as revolutionary in its day as some of the others. Yes, it paved the way, but then so did the Riva TNT and TNT2 — there would be no GeForce if not for the original TNT card in 1998.

If we go by Tom's testing back in 1999, looking at the 1024x768 results as being more representative of what people were using (I did buy a 1600x1200 monitor back then, but it wasn't great for gaming until the early 2000s with faster GPUs), the GeForce 256 SDR was only 15% faster than the TNT2 Ultra. It was a bigger jump of 38% against the TNT2, though, and the GeForce 256 DDR was 32% faster than the TNT2 Ultra.

The real issue was that, back in the day, most of those GeForce features (hardware T&L) weren't really utilized until two or three generations later. So it was mostly just a modest bump in performance and paving the way for the future.

Today, over 24 years later, the original GeForce cards created the name but they're far enough in the past that I don't think most people pay much attention to them. If we do the best Intel CPUs of all time, do we need to list the 8086, just because it was first? Some would say yes, unequivocally, and others would be just as adamant in saying no. 🤷‍♂️
I don't agree. Back when the Riva TNT came out, games like Unreal really did make use of the card's features: bump mapping had just gotten started, and the 16-bit color depth of the 3dfx Voodoo 1/2/3 did look flat when viewed side by side, especially on the CRT screens we had at the time.

Yes, I owned both an Asus Riva TNT 16MB (Nvidia chip) and a Diamond Monster 3D (3dfx Voodoo 1), and I could switch from one to the other with a simple API switch. The ATi Rage was similar, but quite a bit more expensive; how things have changed.

I do agree about the original GeForce 256: its features weren't used before DirectX 7 became prevalent, and that took some time. When it did, the card lacked the oomph to be usable, and it's only with the GeForce 4 (MX included) that T&L became really necessary. But by then, more complex shaders were commonplace across game engines, especially with the original Xbox out, and the first two generations of GeForce (along with all other DX7 graphics cards) got the boot.
 
Yes, the 3060 could be called "mainstream", a term you used repeatedly when reviewing the 3060. Not once did you say "mid-range". To the contrary, Tom's Guide's senior editor Marshall Honorof did describe the 3070 (quite correctly) as a mid-range card:
I use the terms "mainstream" and "midrange" as synonymous. Sorry if that's not clear, but most places do that. Tom's Guide can use midrange and still mean the same thing, though I would say the 3070 at launch ($499 in theory) was decidedly into the high-end bracket, or at least upper-mainstream. There's been a price creep over time, and 2020-2022 did not help things, so we're now in a bit of a vague area.

Budget: Used to be sub-$150, now it's more like $200-ish.
Mainstream/Midrange: Used to be $200-$350, now it's more like $300-$400.
High-end: Used to be anything above $400, now it's more like $500-$700, because...
Enthusiast/Extreme: The new $1,000-and-above category that was formerly reserved for stuff like Titan.
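To make the shifted brackets concrete, here's an illustrative sketch mapping a launch MSRP to the updated tiers above. The exact boundary values are my own reading of the ranges in this post, not an official taxonomy, and real cards near a boundary are judgment calls:

```python
# Hypothetical tier boundaries, loosely based on the updated ranges
# described above ($200-ish budget, $300-$400 mainstream, $500-$700
# high-end, $1,000+ enthusiast). Gaps between ranges are rounded
# into the nearest tier for simplicity.
def price_tier(msrp: float) -> str:
    if msrp <= 200:
        return "budget"
    if msrp <= 400:
        return "mainstream/midrange"
    if msrp < 1000:
        return "high-end"
    return "enthusiast/extreme"

print(price_tier(329))   # RTX 3060 launch MSRP -> mainstream/midrange
print(price_tier(1599))  # -> enthusiast/extreme
```

By that reading, the 3060's $329 MSRP lands squarely in the mainstream/midrange bracket, which is the crux of the disagreement in this thread.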


For everyone talking about the G80/G92 and various iterations of that chip: The single entry for the 8800 GTX applies to all of these. We had to pick a representative GPU, and as the base design (G92 was just a die shrink of G80), we felt it was the overall best pick. But arguing that 8800 GT or 9800 GTX or whatever other card you want to name that used either G80 or G92 is "better" is rather missing the point: They were all good, and that's why the G80/G92 belongs on the list. Don't get lost in the forest and forget to see the trees.
 
The GeForce 256 was a milestone in the graphics department, IMHO. It wasn't a monster, no, but it was a game-changer in how we viewed 3D graphics acceleration and in how Nvidia's cards were named (long term).

Fully agreed. If I'm not mistaken, the Creative 3D Blaster GeForce 256 Annihilator Pro was the card that introduced us to Transform and Lighting: a feature which, at that time, was considered revolutionary.

I also think the TNT2 Ultra deserves a mention, as a GPU that was a very strong competitor to the 3dfx Voodoo 3 chipset.
 
If only Nvidia had had the 90-class cards back when the 1080 Ti came out. (And no, I'm not referring to Titan; I'm referring to the possibility of a card more powerful than the 1080 Ti following the same reasonable pricing structure as the rest of the 10 series.) Imagine a 1090 or 1090 Ti with 16GB or even 20GB of VRAM and just-below-Titan-class levels of GPU horsepower (to make the Titans still make an ounce of sense). I wonder how well it would hold up today. One of my friends has a 1080 Ti and an i7-980 (a little bit of a limitation, just a little bit). It still works for everything they play.
 
For everyone talking about the G80/G92 and various iterations of that chip: The single entry for the 8800 GTX applies to all of these. We had to pick a representative GPU, and as the base design (G92 was just a die shrink of G80), we felt it was the overall best pick. But arguing that 8800 GT or 9800 GTX or whatever other card you want to name that used either G80 or G92 is "better" is rather missing the point: They were all good, and that's why the G80/G92 belongs on the list. Don't get lost in the forest and forget to see the trees.

That's fair. I still think the 8800GTX should be rated #1 over the 1080 Ti though.
 
Adding to this, Titans had all their shaders unlocked, with xx80Ti getting a little snip.
The original Titan did not use a fully enabled die. Neither did the Titan X (Pascal).

Every time I've seen someone come up with rules about what makes a 'real' Titan (VRAM size, double precision performance, fully enabled die, etc.), usually in the context of why the recent xx90 cards wouldn't qualify as a Titan, there is at least one existing Titan card that breaks that rule. The only rule I can think of is that a Titan is the most expensive, powerful Geforce card at time of release.
 
🤔
I guess I'm still 'young' and missed out on some of them other greats; GTX 680 was when I took interest in DIYPC.



What highlights did they make?
I don't think folks talked about 'em too much.

More shaders and double the memory at a time when 3GB was the max...

The Titan Black was a huge hit at the time of the 780 Ti, and the Titan Z was the best dual-GPU card ever released at the time (two Titan Blacks in one).
 