Nvidia GeForce RTX 5090 Founders Edition review: Blackwell commences its reign with a few stumbles

It's pretty hilarious to read all the negative comments about the 5090, how little it improves on the last generation, and what a terrible performance value it is, while recalling these same people singing praises to the heavens about the dominant gaming lead the 9800X3D has over the competition.

[Chart: relative gaming performance, 3840x2160]

That's a 0.3% lead at the highest mainstream gaming settings vs. the previous generation for a $450 8-core CPU, and a 1% lead vs. Intel's best.

[Chart: relative performance, 3840x2160]

Meanwhile, the 5090 is 35% faster at 4k than the previous-generation halo and at least 72% faster than the cards 99% of gamers own (according to Steam). If you own a two-generation-old 3090, the 5090 is over 130% faster. That's not a worthwhile upgrade? The 9800X3D is 2% faster than the two-generation-old 5800X3D, and yet the 9800X3D is the only option if you're a gamer, according to multiple posters trashing the 5090 in this thread. Switch to ray tracing at 4k and the lead only grows, with the 5090 being a pretty incredible 150% faster than anything AMD currently sells.

The 5090 certainly isn't for everyone. If you don't game at 4k, you should not buy this card, and $2000 is a helluva lot of money for gaming. Power usage, especially at idle, is also surprisingly bad, though it is still among the most efficient cards ever produced because of how much faster it is than everything else. Anyone acting like this is a meaningless upgrade for anyone who doesn't own a 4090 is just trolling or pretending they work for AMD's marketing department.
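Just as a sanity check on how those generational percentages chain together, here's a quick back-of-the-envelope sketch (my own, using only the figures quoted above; the exact averages depend on the game suite):

```python
# Chaining "X% faster" claims multiplies the speedup ratios.
# Figures are the ones quoted above (4k averages); actual numbers
# vary with the game suite and settings.
ratio_5090_vs_4090 = 1.35   # "35% faster at 4k than the previous-generation halo"
ratio_5090_vs_3090 = 2.30   # "over 130% faster" than a two-generation-old 3090

# Implied 4090-vs-3090 uplift from those two claims:
implied_4090_vs_3090 = ratio_5090_vs_3090 / ratio_5090_vs_4090
print(f"Implied 4090 vs 3090 uplift: {implied_4090_vs_3090:.2f}x "
      f"(~{(implied_4090_vs_3090 - 1) * 100:.0f}% faster)")
# -> roughly 1.70x, so the two claims are at least mutually consistent.
```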
 
Thank you @JarredWaltonGPU for the detailed review. The card is clearly priced out of my range, but I was never the target market for it. Those individuals and companies who can afford and need a 5090 pretty much get what they pay for.
I'm looking forward to the 5080 review, but I'm not holding my breath for any shocking leap in performance over the 4080 Super 😆
 
I'm looking forward to the 5080 review, but I'm not holding my breath for any shocking leap in performance over the 4080 Super 😆
It won't be. The 4080 Super is only negligibly faster than the OG 4080; the only significant difference was the $200 price drop. That said, the 5080 should be a very solid upgrade over your current 3080. If it performs like a 4090, the 5080 will be twice as fast at 4k as a 3080.
 
It's pretty hilarious to read all the negative comments about the 5090, how little it improves on the last generation, and what a terrible performance value it is, while recalling these same people singing praises to the heavens about the dominant gaming lead the 9800X3D has over the competition.

[Chart: relative gaming performance, 3840x2160]

That's a 0.3% lead at the highest mainstream gaming settings vs. the previous generation for a $450 8-core CPU, and a 1% lead vs. Intel's best.

[Chart: relative performance, 3840x2160]

Meanwhile, the 5090 is 35% faster at 4k than the previous-generation halo and at least 72% faster than the cards 99% of gamers own (according to Steam). If you own a two-generation-old 3090, the 5090 is over 130% faster. That's not a worthwhile upgrade? The 9800X3D is 2% faster than the two-generation-old 5800X3D, and yet the 9800X3D is the only option if you're a gamer, according to multiple posters trashing the 5090 in this thread. Switch to ray tracing at 4k and the lead only grows, with the 5090 being a pretty incredible 150% faster than anything AMD currently sells.

The 5090 certainly isn't for everyone. If you don't game at 4k, you should not buy this card, and $2000 is a helluva lot of money for gaming. Power usage, especially at idle, is also surprisingly bad, though it is still among the most efficient cards ever produced because of how much faster it is than everything else. Anyone acting like this is a meaningless upgrade for anyone who doesn't own a 4090 is just trolling or pretending they work for AMD's marketing department.
Friendly reminder... man, you are comparing CPU performance in a GPU-bottlenecked scenario.
Try plugging a 5090 vs. a 2080 into a Sandy Bridge i7 and you will see exactly the same pattern on the GPU side.

In 1080p tests, where the GPU bottleneck is removed, the X3D is generally 30-40% faster than the 14900K in a lot of tests IIRC. Plus, in the CPU market the price didn't skyrocket each generation, so the cost/performance progression makes upgrading more sensible than it does for the 5090.
 
The key metric is adoption rate. Nvidia reported that 80% of gamers are already using DLSS in some capacity (probably upscaling), and every indication I've seen also points to widespread use (and getting wider). That qualifies DLSS/FSR/XeSS as "essential tech" rather than a "nice to have" feature, and thus makes it relevant for performance reviews.
There's using it and then there's using it. I absolutely have used DLSS, but I could count on one hand the number of titles I've purposely used upscaling in (a couple of games had it turned on by default, and I turned it off once I realized). There are also titles where DLSS/FSR/XeSS fixes image quality even if you're not using the upscaling itself.

There are so many asterisks involved with DLSS/FSR/XeSS that, while I agree it's a very important feature, I don't think it's even remotely relevant for video card reviews. If there were a single setting that provided exactly the same results no matter which card you used, then it would be, but that isn't where we're at today. This technology deserves a completely separate look that compares image quality (and performance where possible) between the implementations.
FG is a more controversial part of DLSS, but recall that upscaling itself was controversial during its DLSS 1 and 2 days. Starting with DLSS 3, upscaling was finally good enough to be considered for default use.

I consider FG to be on the same maturation curve. Its one caveat is needing a high-enough base framerate. It won't make a slow PC faster, but it can give a decent PC a smoother, and thus better, gaming experience overall.
Frame generation will likely become a mandatory feature for hitting very high refresh rates on higher-resolution monitors, or just for evening out overall frame rates. As a technology it's nowhere near ready, though, and we're already on Nvidia's second full generation of it. DLSS 2.x resolved the inherent issues with DLSS 1.x, and it has just been smoothed out from there as time goes by. Until frame generation can be looked at the same way as upscaling, where you pretty much get the same experience in most titles, it's stuck being evaluated on a per-title basis.
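To make the "high-enough base framerate" caveat concrete, here's a minimal illustration (my own sketch with made-up but plausible numbers, not figures from the review):

```python
# Frame generation raises displayed fps, but input is only sampled on
# rendered frames, so responsiveness still tracks the base framerate
# (and FG adds a bit of extra buffering latency on top).
def fg_summary(base_fps: float, gen_factor: int) -> None:
    displayed_fps = base_fps * gen_factor        # e.g. 2x or 4x frame generation
    input_interval_ms = 1000.0 / base_fps        # input cadence stays tied to rendered frames
    print(f"base {base_fps:>5.1f} fps x{gen_factor} FG -> "
          f"{displayed_fps:>6.1f} fps displayed, "
          f"~{input_interval_ms:.1f} ms between input samples")

fg_summary(30.0, 4)   # 120 fps shown, but ~33 ms input cadence: still feels sluggish
fg_summary(60.0, 2)   # 120 fps shown with ~17 ms input cadence: much better starting point
```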
 
>I remember when video games were fun and accessible to everyone.

Games are still accessible, even AAA games on $300 GPUs. You just have to manage your expectations and dial the details down.

Black Myth: Wukong on an RTX 4060 looks pretty decent to me.

For a fun read on the state of gaming from the industry's standpoint, see the links below.

PRESENTATION: The State of Video Gaming in 2025
https://matthewball.co/all/stateofvideogaming2025

Who is Matthew Ball?
https://matthewball.co/about
 
It won't be. The 4080 Super is only negligibly faster than the OG 4080; the only significant difference was the $200 price drop. That said, the 5080 should be a very solid upgrade over your current 3080. If it performs like a 4090, the 5080 will be twice as fast at 4k as a 3080.
No doubt it’s gonna be a huge leap compared to my 3080, but I’m not so enthusiastic about paying €1200 for a 16GB card.
 
No doubt it’s gonna be a huge leap compared to my 3080, but I’m not so enthusiastic about paying €1200 for a 16GB card.
I am using a 3070 Ti and I'm kind of tempted to upgrade, but $1000 for a Founders Edition 5080 is... let's forget it. A $1000 camera lens can take nice photos and not be out-resolved by new cameras for 10-20 years, but nowadays a $1000 card lets you game at ultra for... 3 years at best.
 
"Yeah, that’s impressive to me and worthy of a 4.5 star score."
For you, sure, it must've been an impressive experience. The reality is that these products a decade-plus ago were generally affordable and thus exciting. The 5090 is not an exciting product. These will land in the hands of less than 0.5%(?) of the gaming dGPU market, marking new all-time lows for flagships.
Great performance PLUS great price = a highish score.
Great performance PLUS horrendous value cannot = a high score too!
Scalpers haven't even sunk their teeth in, and the new tariffs will probably push these into $4K territory. Who knows, who even cares anymore. The whole thing has become a parody! And you give it 4.5... come on...
 
I am using a 3070 Ti and I'm kind of tempted to upgrade, but $1000 for a Founders Edition 5080 is... let's forget it. A $1000 camera lens can take nice photos and not be out-resolved by new cameras for 10-20 years, but nowadays a $1000 card lets you game at ultra for... 3 years at best.
I recently changed my approach to upgrading my rig. For years I had neglected sound quality and mostly relied on headphones to compensate. The other day I bought a pair of studio monitors for €400 and hooked them up to my audio interface; I was blown away by how much better they sounded than my old 2.1 setup, and at such an incredibly reasonable price point.
 
This is a really underwhelming "next gen," but then you realize any game can be played maxed out today with basically any card from the last 10 years, and more importantly, you remember that games are just one in an infinite deck of experiences to live. So everyone should just buy the GPU they can afford and move on.
 
Friendly reminder... man, you are comparing CPU performance in a GPU-bottlenecked scenario.
Try plugging a 5090 vs. a 2080 into a Sandy Bridge i7 and you will see exactly the same pattern on the GPU side.

In 1080p tests, where the GPU bottleneck is removed, the X3D is generally 30-40% faster than the 14900K in a lot of tests IIRC. Plus, in the CPU market the price didn't skyrocket each generation, so the cost/performance progression makes upgrading more sensible than it does for the 5090.
That's exactly the point. 30-40% faster at 1080p? What are you talking about? It's less than half that on average, about 14%. And that only applies if you pair it with a $2000 GPU. You just bought your brand new halo GPU, and now you have to play at 1080p to see your CPU perform better than the competition. That makes zero sense. The 9800X3D is a "halo" product with a halo price that doesn't perform any better at halo settings.
 
Friendly reminder... man, you are comparing CPU performance in a GPU-bottlenecked scenario.
Try plugging a 5090 vs. a 2080 into a Sandy Bridge i7 and you will see exactly the same pattern on the GPU side.

In 1080p tests, where the GPU bottleneck is removed, the X3D is generally 30-40% faster than the 14900K in a lot of tests IIRC. Plus, in the CPU market the price didn't skyrocket each generation, so the cost/performance progression makes upgrading more sensible than it does for the 5090.
I thought the same thing; this guy is being dishonest.
 
What can you do with a 4-way RTX 5090 setup and 128GB of VRAM for training AI models? I don't know enough to say what's possible, but it's definitely more useful than a 4-way 4090 setup with 96GB of VRAM. LOL
Nothing, for training AI.
These models have no practical application: anything you'd want to train on this hardware you could train practically on a calculator for much less money; NV itself recommends a Jetson SoC with 256 CUDA cores as a starting point, and it works. For commercial-grade models you need a different class of computing resources, and that is not a 30/40/50-series -90 card.
Why was the 4090 banned from sale to China? Because there were ways to make it work, not as an equivalent to the H100 but as something usable with the right software stack. There are images of racks of systems full of 4090 cards, and I'm sure they're not trying to do crypto mining on them these days. They're being used for AI research, because if you can get a $2000 card anywhere close to matching Nvidia's $50,000+ data center cards, it becomes an interesting business model.
You're a bit out of the loop. The usual practice in China was desoldering the chips and soldering them onto custom boards with 48+GB of VRAM, not using these RTX 4090 cards as-is. The same thing will be done with this chip; all they need is the chip itself, not the 40/50-series -90 product being evaluated here.
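For what it's worth, the "4 x 32GB vs. 4 x 24GB" framing is just aggregate VRAM; here's a minimal PyTorch sketch (assuming a local CUDA-enabled PyTorch install) that reports what a multi-GPU box actually exposes. Whether a training job can use the pooled memory depends on the parallelism strategy, not just the total:

```python
# Minimal sketch: report per-GPU and aggregate VRAM on a multi-GPU machine.
import torch

def report_vram() -> None:
    if not torch.cuda.is_available():
        print("No CUDA devices visible.")
        return
    total_gib = 0.0
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        gib = props.total_memory / 1024**3
        total_gib += gib
        print(f"GPU {i}: {props.name}, {gib:.1f} GiB")
    print(f"Aggregate VRAM: {total_gib:.1f} GiB")

report_vram()   # e.g. four 32 GiB cards -> ~128 GiB aggregate
```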
 
A 4.5 star product means it delivers some impressive gains in performance as well as features and technology, not just that it’s a great value.
As for the shiny new features: their usefulness cannot be verified at this point, because they have not been used anywhere yet. This applies to all features that require progress in game development and game engines. No one except NV's own demo teams will implement them; we see this even with the "new" features introduced with Ada (OMM, SER, etc.), which show up either in sponsored showcases like CP2077 or in Portal RTX (made by Nvidia). By the way, there is no news about other new RTX Remix games...
So they should not affect the evaluation of the product in any way.
 
No, not really. The consumer products used for AI are bought by home users or self-employed contractors doing GenAI: artists or other content creators.

Companies who want to do that or more just rent it from a provider. Basically, AIaaS is in full swing now; I'm getting pelted dozens of times a day by vendors trying to sell us on their services. While our executive team and business units are chomping at the bit, our CIO has opted for a slow and steady approach.

Nobody, and I mean nobody, is running a business on a bunch of thrown-together 4090s. The liability alone would have the lawyers screaming at us, not to mention CAPEX vs. OPEX. Medium and larger businesses just do not operate that way. Taking on major liability to save a measly $20-30K is just really bad business. Instead, you contract it out to a service provider, complete with SLAs and legal obligations. That service provider will own and operate the physical infrastructure, and they aren't using 4090s.
Great post, and needed on these forums to show how things are done in the corporate environment.

It's pretty hilarious to read all the negative comments about the 5090, how little it improves on the last generation, and what a terrible performance value it is, while recalling these same people singing praises to the heavens about the dominant gaming lead the 9800X3D has over the competition.

[Chart: relative gaming performance, 3840x2160]

That's a 0.3% lead at the highest mainstream gaming settings vs. the previous generation for a $450 8-core CPU, and a 1% lead vs. Intel's best.

[Chart: relative performance, 3840x2160]

Meanwhile, the 5090 is 35% faster at 4k than the previous-generation halo and at least 72% faster than the cards 99% of gamers own (according to Steam). If you own a two-generation-old 3090, the 5090 is over 130% faster. That's not a worthwhile upgrade? The 9800X3D is 2% faster than the two-generation-old 5800X3D, and yet the 9800X3D is the only option if you're a gamer, according to multiple posters trashing the 5090 in this thread. Switch to ray tracing at 4k and the lead only grows, with the 5090 being a pretty incredible 150% faster than anything AMD currently sells.

The 5090 certainly isn't for everyone. If you don't game at 4k, you should not buy this card, and $2000 is a helluva lot of money for gaming. Power usage, especially at idle, is also surprisingly bad, though it is still among the most efficient cards ever produced because of how much faster it is than everything else. Anyone acting like this is a meaningless upgrade for anyone who doesn't own a 4090 is just trolling or pretending they work for AMD's marketing department.
A 9800X3D being 2% faster than a 5800X3D at a GPU-bottlenecked resolution like 4k does not tell the whole picture.

It also ignores the roughly 40% increase in single-threaded speed between them, which you will feel even on the desktop. The narrative that people only play games on their PCs is also not accurate.
 
No one takes "value

In volume, the 5090 compared to the overall GPU market in its entirety is ridiculously small. This is like going on a car forum and complaining about Lambo prices, or a watch forum and complaining about Rolex. People were yelling and telling me the same thing back when I got a GTX 280 and a 295 almost 20 years ago: "why are you buying that, man, it's so terrible, no value." Cool, bro. "Value" is not something people think about at this level, and honestly it shouldn't be thought about. At this level, you either want it or you don't. And there's nothing wrong with being in either of those camps.
Comparing a GPU to a Rolex is RIDICULOUS. I can get a $20 watch that does the EXACT same thing as a Rolex. A Rolex is a luxury product. A GPU isn't.
 
A 9800X3D being 2% faster than a 5800X3D at a GPU-bottlenecked resolution like 4k does not tell the whole picture.

It also ignores the roughly 40% increase in single-threaded speed between them, which you will feel even on the desktop. The narrative that people only play games on their PCs is also not accurate.
See my follow-up post; I addressed your points. Your last sentence is completely contrary to how the community actually pushes X3D CPUs. It is common to see posts questioning why AMD even sells non-X3D CPUs, as if the only thing anyone does on their PC is gaming and nothing else. I am well aware most people don't game at all on their PCs.
 
"Yeah, that’s impressive to me and worthy of a 4.5 star score."
For you, sure, it must've been an impressive experience. The reality is that these products a decade-plus ago were generally affordable and thus exciting. The 5090 is not an exciting product. These will land in the hands of less than 0.5%(?) of the gaming dGPU market, marking new all-time lows for flagships.
Great performance PLUS great price = a highish score.
Great performance PLUS horrendous value cannot = a high score too!
Scalpers haven't even sunk their teeth in, and the new tariffs will probably push these into $4K territory. Who knows, who even cares anymore. The whole thing has become a parody! And you give it 4.5... come on...
Reviewers have always been part of the problem, never the solution. They never see the forest for the trees. Look at all the other tech sites giving this thing "gold" awards.🤡
 
Did you bench on an open test bench or in a PC case? I am asking because there are some major concerns about overheating, with the CPU cooler being choked by 575W of heat dissipated inside a closed PC case. If you have an air cooler (HSF) on your CPU, then you are screwed.

[attached screenshot]
What review is this? Trying to find it.
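On the 575W-in-a-case concern quoted above, a rough back-of-the-envelope airflow estimate (my own sketch using the standard air heat-capacity approximation, not a figure from the review):

```python
# Rough estimate of case airflow needed to carry a heat load out at a chosen
# air temperature rise. Uses air density ~1.2 kg/m^3 and specific heat
# ~1005 J/(kg*K); real cases recirculate air, so treat this as a lower bound.
CFM_PER_M3_PER_S = 2118.88  # 1 m^3/s expressed in cubic feet per minute

def required_airflow_cfm(watts: float, delta_t_c: float) -> float:
    m3_per_s = watts / (1.2 * 1005.0 * delta_t_c)
    return m3_per_s * CFM_PER_M3_PER_S

# 575 W exhausted with a 10 C rise of case air over ambient:
print(f"{required_airflow_cfm(575, 10):.0f} CFM")   # ~100 CFM of through-flow
```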
 
That's exactly the point. 30-40% faster at 1080p? What are you talking about? It's less than half that on average, about 14%. And that only applies if you pair it with a $2000 GPU. You just bought your brand new halo GPU, and now you have to play at 1080p to see your CPU perform better than the competition. That makes zero sense. The 9800X3D is a "halo" product with a halo price that doesn't perform any better at halo settings.
That's exactly where I don't agree with your argument. You buy the halo CPU because it can max out your next few halo GPU purchases at 4k; that's the point of buying a halo product, to stay at the top attainable performance for longer. Yes, the 9800X3D is expensive now, but you can be confident that even with a 6090 or an 8090 it won't be bottlenecking your rig in 4k gaming. But this halo GPU? You can't even max it out with the mid-range CPUs of the last two generations; that's called paying for a bottleneck in a halo use case. And DLSS frame generation is not a true halo feature. Yes, it generates a ton of frames to smooth out your graphics, but there are glitches here and there that you notice during game sessions. They don't always pop up, but you're paying $2000 for a card that, in its release year, already needs frame generation to play new titles smoothly at 4k, while putting up with annoying glitches (e.g., Star Wars Outlaws showing weird textures now and then in some YouTube reviews, or altimeter ghosting in the flight sim).

See my follow-up post; I addressed your points. Your last sentence is completely contrary to how the community actually pushes X3D CPUs. It is common to see posts questioning why AMD even sells non-X3D CPUs, as if the only thing anyone does on their PC is gaming and nothing else. I am well aware most people don't game at all on their PCs.
And those people don't even buy a GPU, let alone a 5090...
 
