Nvidia's original GTX Titan benchmarked 11 years later — $1,000 card now 'barely usable' in modern titles, often beaten by AMD's sub-$200 RX 6400

But GPUs have basically stayed on the Moore's Law curve, at least until recently. So, this would be almost like comparing a 2.53 GHz Northwood Pentium 4 against a 486DX2-66.

The thing that's improved least is memory bandwidth (leaving aside HBM). However, bigger & faster caches have gone a long way towards mitigating that bottleneck.
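To put rough numbers on that (an illustrative sketch of mine, with specs quoted from memory rather than taken from the article, so worth double-checking): peak GDDR bandwidth is just bus width times per-pin data rate.

```python
# Peak memory bandwidth in GB/s = bus width (bits) * per-pin data rate (Gbps) / 8.
# Specs quoted from memory, for illustration only; verify against official spec sheets.
cards = {
    "GTX Titan (2013)": (384, 6.0),   # 384-bit GDDR5 at ~6 Gbps
    "RX 6400 (2022)":   (64, 16.0),   # 64-bit GDDR6 at 16 Gbps, plus 16 MB Infinity Cache
    "RTX 4090 (2022)":  (384, 21.0),  # 384-bit GDDR6X at 21 Gbps
}

for name, (bus_bits, gbps_per_pin) in cards.items():
    bandwidth_gbs = bus_bits * gbps_per_pin / 8
    print(f"{name}: ~{bandwidth_gbs:.0f} GB/s")
```

Notice the RX 6400's raw figure is actually lower than the 11-year-old Titan's; it gets away with the narrow bus because the on-die cache absorbs so many of the memory accesses, which is exactly the mitigation described above.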
 
Kepler was a one-thing-at-a-time architecture, and games moved more and more toward concurrent compute around 2016 (which was also when SLI compatibility started to fade away). There are still games it does well in, but the list of games it currently does badly in dwarfs any issues Arc has.

But it is nice to see a review that shows the architecture's full potential. A clock speed of 1,250 MHz was definitely doable, and reviewers generally did comparisons with the card running at less than 1 GHz. That bothered me, because I was getting 60 fps in the newly released Witcher 3 at medium-high settings in 4K with 780 Tis in SLI running at 1,250 MHz. It was an awesome gaming experience. I'm still using the same TV monitor.

Kepler was good for its time. That time has passed.
 
I've owned one since shortly after it was released, and it still works today, although it's not installed in any system at the moment. It was the longest-lasting GPU I ever owned, and I've been building my own PCs since 1998. I used it in a hackintosh connected to a 60 Hz 1440p IPS Shimian monitor up through macOS Mojave (so for 6 years I ran the GTX Titan at 1440p from the first day I installed it; I've been enjoying 1440p gaming and daily computer use for 11 years now), and I also used it on Windows up to that point.

I built a new system in 2019, when it became impossible to use an Nvidia GPU newer than the Kepler generation on macOS. I didn't have much budget for a GPU at the time of the new build, so I bought a cheap RX 580 that was barely better in performance than my 6-year-old Titan in most games on Windows. It lasted me just a couple of years before GPUs became scarce and too expensive due to crypto mining, and I had to settle for an RX 6600 as an upgrade to the barely passable RX 580. I've now decided to abandon hackintoshing altogether, since it's no longer a viable option after Apple abandoned Intel CPUs, so my next GPU will certainly be an Nvidia again. But AMD cards have come a long way, and they are pretty good if you don't need all the bells and whistles.
 
If I had one, I'd replace it *only* because of missing features, not because it can't run some games. Otherwise, an old flagship that runs games like an entry-level card today is still a good card. That's why I don't like the push for frequent upgrades seen in most reviews, in the form of "it runs today's games at 60 fps, so it's good," which means you need to upgrade again in two years. Get a stronger card, slightly above what you need (a 1440p card for 1080p gaming, for example), and get more value out of your money.
 
But GPUs have basically stayed on the Moore's Law curve, at least until recently. So, this would be almost like comparing a 2.53 GHz Northwood Pentium 4 against a 486DX2-66.
Yup, that 66 MHz 486 would win the match-up fairly easily!

-

Okay, I am of course jesting.

It would take the 100 MHz DX4 to achieve the feat.
 
You also have to say (though you didn't) that the 2013 Titan still has relatively up-to-date drivers (October 2023), even though it is out of support, and can be used on Windows 11. The Radeon 200 series of the same age is not supported in Windows 11 and has had no driver update since 2022; the same goes for the Fury series released in 2015 and the Radeon Pro Duo from 2016.

Can you imagine paying $1,500 for the Radeon Pro Duo and having it no longer supported 6 years later? Actually, for professional users the last driver update was in 2021, so 5 years.
 
But GPUs have basically stayed on the Moore's Law curve, at least until recently. So, this would be almost like comparing a 2.53 GHz Northwood Pentium 4 against a 486DX2-66.

The thing that's improved least is memory bandwidth (leaving aside HBM). However, bigger & faster caches have gone a long way towards mitigating that bottleneck.
Not just memory bandwidth: IPC has increased significantly, and so have the instruction sets, so GPUs can do more things and do them faster.

This "news" and I'm not saying it's a bad article by that. .. this news shouldn't be a shock to anyone. GPUs will pretty much always get destroyed by newer hardware because of amount of memory, IPC and instruction sets not to mention clock speed, my 4090 easily runs at 3200mhz all day @ 100% load without ever passing 66 degrees and if I could get more power through it I could go higher because there is no clock limit on the (I believe) 2000 series and above they are entirely thermal & power not core clock limited and assuming you can get enough power and cooling they remain stable well above their shipped clocks so an extra 1ghz is not out of the realm of possibilities but the issue becomes the voltage curve and finding the right values and at a certain point the law of diminishing returns kicks in because extra speed requires significant power which increases thermals exponentially
 
I bought one a few months back, and with my computers hooked up to a KVM switch, I sometimes lose track of which one is on the screen if I get busy. I always switch to my faster gaming machine when I'm going to game.


If I'm tired, or I get up for a cup of coffee, zone out, and get an urge to play a game, I'll load up Steam, hit, say, Serious Sam 4, and be playing for 20 minutes before I realize I'm on the original 6 GB Titan because I forgot to switch computers. Not one indication that I'm on the old, over-the-hill card.

For what it's worth, she still has a lot of life left in her.

In the story where they tested the Titan with Crysis Remastered, I get better FPS than what the story posted.

Tomorrow I will switch the Titan to a spare computer that supports AVX (the Crysis 2 and 3 remasters need AVX) and see how the remastered Crysis 1, 2, and 3 fare with the card.
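If it helps, here's a quick way to confirm a spare box actually exposes AVX before moving the card over. This is a small sketch using the third-party py-cpuinfo package (install it with pip install py-cpuinfo), not anything from the article:

```python
# Quick check for AVX/AVX2 support before installing the card in a spare machine.
# Uses the third-party py-cpuinfo package: pip install py-cpuinfo
import cpuinfo

flags = cpuinfo.get_cpu_info().get("flags", [])
for feature in ("avx", "avx2"):
    print(f"{feature.upper()}: {'supported' if feature in flags else 'not supported'}")
```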



I already have a GTX 690 in the computer with AVX, and it chops up the Crysis remastered releases like butter. And seeing as that card is an SLI dead end, the Titan should really shine.



As far as playing some of the heavy-hitting, demanding games of the last year or so, this Titan is far from an aged-out GTX 280, where you hit the "does not meet the minimum system requirements" wall just trying to start today's games.
 
Out of all the Nvidia GPUs I've owned, the ones I got the most out of were the 8800 and the 1080 Ti. My 1080 Ti is still running in an old build and can still play modern games fine at 7 years old; the extra VRAM certainly helps. When I upgraded in 2020, I knew it was bonkers to go with a 10 GB 3080, so I went with the 24 GB 3090, and it has lasted much longer, while my friend got rid of his 3080 Ti because 12 GB of VRAM is no longer enough for higher settings.
 
A German review outlet has gone back and retested Nvidia's first GTX Titan graphics card 11 years after it launched to see how its performance stacks up to entry-level GPUs in 2024.

Nvidia's original GTX Titan benchmarked 11 years later — $1,000 card now 'barely usable' in modern titles, often beaten by AMD's sub-$200 RX 6400: Read more
I think there must be something wrong with their testing methods.

1. PCGH claims Doom Eternal gets 17.9 frames per second on a Titan. Not 179... less than 20. Doom 2016 and Doom Eternal famously run at over 100 fps on almost anything.
2. PCGH shows The Witcher 3 getting 30 fps, but UserBenchmark shows multiple results of the Titan getting ~45 fps in The Witcher 3.
3. UserBenchmark shows Overwatch getting 111 fps on the Titan, while Overwatch 2 is curiously missing from PCGH's review.

I don't care to dig up more supporting benchmarks; I just think something's wrong with their testing.
I've used and seen half a dozen 10-series cards in 5 computers, and they absolutely crushed 1080p gaming (60+ fps) until Red Dead Redemption 2 had to be cut down to 30 fps to hold vsync.