News: Nvidia GeForce RTX 5090 versus RTX 4090 — How does the new halo GPU compare with its predecessor?

Thank you for the analysis, Jarred. I'm eyeing the 5080 for my next upgrade, and its pricing is reasonable, all things considered; yet I'm not sure 16GB of VRAM on an 80-class card is still a good deal, even if it's GDDR7. I have a feeling there's a refresh or an 80 Ti in the works with 20/24GB coming within the year.
 
Thank you for the analysis, Jarred. I'm eyeing the 5080 for my next upgrade, and its pricing is reasonable, all things considered; yet I'm not sure 16GB of VRAM on an 80-class card is still a good deal, even if it's GDDR7. I have a feeling there's a refresh or an 80 Ti in the works with 20/24GB coming within the year.

A 5080 with 16GB of RAM should be boycotted. I'd prefer the 5070 Ti, as it has the same amount of RAM and close-enough performance. I strongly expect to see a 5080 Super/Ti with 20/24GB of RAM at some point.
 
The article said:
Looking at the Far Cry 6 results, it appears the 5090 will offer about 27% more performance than the 4090 in games where the new AI features aren't part of the equation. In A Plague Tale: Requiem, the gap increases to about 43%
FWIW, I had predicted "37% to 44% ... I feel more comfortable with the lower end of that range".

While 27% is a fair bit less than I estimated, 43% falls directly in my range.

I also predicted that it would be a better value in perf/$ than the RTX 4090, which it just barely manages. They increased the list price by 25%, so if 27% is truly the typical uplift for non-DLSS usage, then it's not actually a worse deal. However, we have yet to see what near-term availability & street prices end up being like.
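To put numbers on "just barely": a quick sanity check, using the official list prices and the ~27% non-DLSS uplift quoted above:

```python
# Back-of-envelope perf/$ check: RTX 5090 vs RTX 4090, using official
# list prices and the article's ~27% non-DLSS uplift estimate.
perf_uplift = 1.27      # 5090 relative performance (Far Cry 6 figure)
price_4090 = 1599       # RTX 4090 launch MSRP, USD
price_5090 = 1999       # RTX 5090 launch MSRP, USD

price_ratio = price_5090 / price_4090            # ~1.25, the 25% increase
perf_per_dollar = perf_uplift / price_ratio

print(f"Price increase: {price_ratio - 1:.1%}")   # ~25.0%
print(f"Perf/$ vs 4090: {perf_per_dollar:.3f}x")  # ~1.016x, i.e. ~1.6% better
```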
 
I have a feeling there's a refresh or an 80 Ti in the works with 20/24GB coming within the year.
It's possible there could be a 24 GB version, since Samsung has already developed 24-Gbit GDDR7 dies.

Normally, I'd say "no way", because the ASIC certainly has only a 256-bit memory interface on it, and the price differential between the RTX 5080 and RTX 5090 is so big that I doubt they would sell partially-disabled GB202 dies as an RTX 5080 Super. Maybe as an RTX 5080 Ti, but it would cost more.
 
It's possible there could be a 24 GB version, since Samsung has already developed 24-Gbit GDDR7 dies.

Normally, I'd say "no way", because the ASIC certainly has only a 256-bit memory interface on it, and the price differential between the RTX 5080 and RTX 5090 is so big that I doubt they would sell partially-disabled GB202 dies as an RTX 5080 Super. Maybe as an RTX 5080 Ti, but it would cost more.
Oh, I am probably 90-ish percent certain that the mid-cycle refresh for Blackwell (meaning sometime in early to mid 2026) will use the same GPUs, possibly with more SMs enabled in some cases, but with 24Gb GDDR7 chips in place of the current 16Gb ones. So every GPU tier could get a 50% upgrade in VRAM capacity. Whether they're called "5080 Super" or "5080 Ti" or something else doesn't really matter.

The exception to this, possibly, is the 5060 Ti and 5060. Depending on when those launch, it's conceivable that Nvidia goes straight to 24Gb chips and ends up with the same 12GB as the 5070, but with 33% less bandwidth due to using a 128-bit interface.

That's my guess as to what will happen at least. And I've definitely been wrong on this stuff once or twice. LOL
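For reference, here's the capacity and bandwidth math behind those configurations; a minimal sketch, assuming the standard 32-bit interface per GDDR7 device:

```python
# VRAM capacity from bus width and die density: each GDDR7 device sits
# on a 32-bit slice of the bus, so chips = bus width / 32, and
# capacity = chips * die density. 16Gb (2GB) dies are current;
# 24Gb (3GB) dies are the speculated refresh.
def vram_gb(bus_bits: int, die_gbit: int) -> float:
    chips = bus_bits // 32
    return chips * die_gbit / 8   # gigabits -> gigabytes

for name, bus in [("256-bit (5080)", 256),
                  ("192-bit (5070)", 192),
                  ("128-bit (5060 class)", 128)]:
    print(f"{name}: {vram_gb(bus, 16):.0f}GB -> {vram_gb(bus, 24):.0f}GB with 24Gb dies")

# At the same per-pin data rate, bandwidth scales with bus width,
# hence the 128-bit card's deficit vs. the 192-bit 5070:
print(f"128-bit vs 192-bit: {1 - 128/192:.0%} less bandwidth")
```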
 
Thank you for the analysis, Jarred. I'm eyeing the 5080 for my next upgrade, and its pricing is reasonable, all things considered; yet I'm not sure 16GB of VRAM on an 80-class card is still a good deal, even if it's GDDR7. I have a feeling there's a refresh or an 80 Ti in the works with 20/24GB coming within the year.
Based on linear scaling, we see that there are no architectural improvements in the shader compute units. Even the power and cost scale linearly.
For the 5080, this means roughly no improvement in gaming performance for the same price...
Bad release?
 
Quite frankly, I would be surprised if there wasn't a mid-cycle refresh for anything $500+ using the 24Gb memory ICs. While it's a low-volume part, they're already using it in the mobile 5090, and MSI had an oopsie showing a 24GB capacity for the 5080.

There also ought to be plenty of room between the 5080 and 5090 for both a new SKU (say, a Ti) and a Super refresh with higher clocks. If they pursue the same strategy as with the 40 series and retire the cards being refreshed, I could see clock/core increases across the stack if the die allows for it.

I find myself looking at this generation a lot differently, since I decided after the B580 that I'd wait to see what Celestial can deliver. Now it's mostly down to an academic sort of interest, and hoping they aren't screwing over people who actually need a new card at a reasonable price, the way this last generation did.
 
Oh, I am probably 90-ish percent certain that the mid-cycle refresh for Blackwell (meaning sometime in early to mid 2026) will use the same GPUs, possibly with more SMs enabled in some cases, but with 24Gb GDDR7 chips in place of the current 16Gb ones. So every GPU tier could get a 50% upgrade in VRAM capacity. Whether they're called "5080 Super" or "5080 Ti" or something else doesn't really matter.

That's my guess as to what will happen at least. And I've definitely been wrong on this stuff once or twice. LOL

This is exactly what I need. My 4070 Ti Super is struggling with modded SkyrimVR: 98% VRAM usage with medium-to-low LODs and shadows 🙁

Need some love for VR tech/features...
 
Based on linear scaling, we see that there are no architectural improvements in the shader compute units. Even the power and cost scale linearly.
For the 5080, this means roughly no improvement in gaming performance for the same price...
Bad release?
I highly doubt there would be no gaming performance bump. I don't know how you came to this conclusion, or maybe I didn't understand your post?
 
Looking at the numbers, it is a ~32% increase in everything: shaders, tensors, RT, SMs, etc.

And the raw result increase is around 27%... I don't think we have anything genuinely new here.
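The published unit counts bear that out. A quick check, using the official SM counts and assuming performance would scale linearly with SM count at similar clocks:

```python
# Spec-sheet scaling vs. measured uplift, RTX 5090 over RTX 4090.
sm_4090, sm_5090 = 128, 170        # published SM counts
spec_scaling = sm_5090 / sm_4090 - 1
measured = 0.27                     # the article's Far Cry 6 estimate

print(f"Unit-count increase: {spec_scaling:.0%}")   # ~33%
print(f"Measured uplift:     {measured:.0%}")       # 27%
# Measured uplift landing below the unit-count increase is what
# suggests little per-SM (architectural) gain in pure rasterization.
```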
 
Here is a good comparison video with the previous gen. Looking at the RTX-off performance metrics that Nvidia showed in the keynote, it's not looking good.

This looks to be the perfect time for AMD with RDNA 4.

 
Based on linear scaling, we see that there are no architectural improvements in the shader compute units. Even the power and cost scale linearly.
For the 5080, this means roughly no improvement in gaming performance for the same price...
Bad release?
Oh no, there is a 2x improvement, just use DLSS4! /sarcasm

Maybe Nvidia used the advantage they have over the competition to work mostly on AI. AMD still needs to catch up on efficiency, AI, RT..., so why bother? Just make it a bit bigger to bump up the numbers, and let the AI do the talking.

I think this is the generation where AMD has the best chance of catching up.
 
FWIW, I had predicted "37% to 44% ... I feel more comfortable with the lower end of that range".

While 27% is a fair bit less than I estimated, 43% falls directly in my range.

I also predicted that it would be a better value in perf/$ than the RTX 4090, which it just barely manages. They increased the list price by 25%, so if 27% is truly the typical uplift for non-DLSS usage, then it's not actually a worse deal. However, we have yet to see what near-term availability & street prices end up being like.
So, waiting 2+ years for 2% or less better performance per dollar is fine by you? We used to get 50-60% more perf for the same money in a similar timeframe. IT hardware has been overpriced for years, with no end to this travesty in sight. All the while, gross margins keep increasing. Enjoy.
 
So, waiting 2+ years for 2% or less better performance per dollar is fine by you? We used to get 50-60% more perf for the same money in a similar timeframe. IT hardware has been overpriced for years, with no end to this travesty in sight. All the while, gross margins keep increasing. Enjoy.
Better get used to it. We're pretty close to the end of 2D scaling. Feature size isn't likely to go subatomic unless we gain a pretty radical new understanding of physics. And 3D scaling is going to be heat-limited unless we learn some pretty neat new tricks as well.
 
So, waiting 2+ years for 2% or less better performance per dollar is fine by you?
I didn't say it was fine, or pass any other value judgement on it.

The comment I linked was on an article that warned against buying an RTX 4090, before we got more clarity on the RTX 5090 specs & pricing. We had gotten into a discussion about hypothetical performance, based solely on information about its die size and process node. I felt comfortable predicting that Nvidia would not launch the RTX 5090 at a price that represented worse perf/$ than the RTX 4090. That's a rather weak prediction, I know, but it seems I was very nearly wrong!
:D
 
Better get used to it. We're pretty close to the end of 2D scaling.
The immediate issue is cost. They used an oldish process node, presumably because cost & demand for newer nodes were too high. High-end GPUs have big dies, which means cost is more heavily influenced by the node used.

That said, you're not wrong that it has something to do with the end of Moore's Law. As density increases slow, we're seeing a big drop-off in the traditional price decreases per transistor, which are crucial for further scaling-up of transistor-intensive chips like GPUs.
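As a purely illustrative sketch of that mechanism (every number below is hypothetical, chosen only to show the shape of the problem, not actual foundry pricing):

```python
# Illustrative only -- hypothetical numbers showing why cost per
# transistor stops falling when density gains slow while wafer
# prices keep climbing. Not actual foundry pricing.
nodes = [
    # (label, relative transistor density, relative wafer cost)
    ("mature node", 1.0, 1.0),
    ("newer node",  1.6, 1.5),   # healthy shrink: density outruns cost
    ("latest node", 2.0, 2.2),   # shrink slows, wafer cost keeps rising
]
for label, density, wafer_cost in nodes:
    # cost/transistor ~ wafer cost / transistors per wafer
    print(f"{label}: relative cost per transistor = {wafer_cost / density:.2f}")
# 1.00 -> 0.94 -> 1.10: on the newest node, transistors cost *more*,
# which squeezes big-die GPUs hardest.
```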
 
I didn't say it was fine, or pass any other value judgement on it.

The comment I linked was on an article that warned against buying an RTX 4090, before we got more clarity on the RTX 5090 specs & pricing. We had gotten into a discussion about hypothetical performance, based solely on information about its die size and process node. I felt comfortable predicting that Nvidia would not launch the RTX 5090 at a price that represented worse perf/$ than the RTX 4090. That's a rather weak prediction, I know, but it seems I was very nearly wrong!
:D
Saying that worst-case performance scaling represents the 5090 is a bit unfair. I'm not saying multi frame gen is going to be amazing (I haven't tried it yet), but there are a bunch of other things new to the architecture that could prove to be very useful. If the worst-case scenario is equivalent performance per dollar (looking at just the cost of the GPU), then the 5090 represents a win for Nvidia. And the 4090 isn't selling for under $2,000 either these days (which is a scarcity / end-of-life problem, not because it couldn't still be produced).
 
Saying that worst-case performance scaling represents the 5090 is a bit unfair.
It was a caveated statement. I do hope typical (non-DLSS) performance improvement is better than 27%, but we'll see. For now, all I feel comfortable saying is that it's not worse perf/$.

We also have yet to see what street prices & availability are like (as you mention). Ultimately, it will be a comparison of those vs. the RTX 4090 that will determine its value.
 
It was a caveated statement. I do hope typical (non-DLSS) performance improvement is better than 27%, but we'll see. For now, all I feel comfortable saying is that it's not worse perf/$.

We also have yet to see what street prices & availability are like (as you mention). Ultimately, it will be a comparison of those vs. the RTX 4090 that will determine its value.
Yeah, and I suspect we could see prices jump a LOT in the first month or two of availability. I think there will be both rich gamers and a bunch of AI companies that will pay well over $2,000 for an RTX 5090. Considering the B200 stuff costs tens of thousands (outside of DIGITS), and this will have FP4 support, even AI developers might want an inexpensive (relative to pro / data center stuff) solution to play with.

Hopefully I'm wrong, but I could very much see the "AI bros" pushing prices in the same way as we saw with the RTX 30-series and the "crypto bros" (and scalpers). Sigh...
 
I think there will be both rich gamers and a bunch of AI companies that will pay well over $2,000 for an RTX 5090. Considering the B200 stuff costs tens of thousands (outside of DIGITS), and this will have FP4 support, even AI developers might want an inexpensive (relative to pro / data center stuff) solution to play with.
That's not all! Don't forget the additional memory bandwidth, capacity, and PCIe 5.0!!

AI bros, incoming!
 
I was going to go for a 5090, but at $2,000 USD, add $100 to $200 USD for the AIB models (reference cards don't often make it to Australia unless privately imported), then convert it to AUD and add extra taxes, etc., and I'm looking at $4,500 to $5,000 AUD for it!!
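A rough sketch of how that total builds up; the ~US$0.62-per-A$1 exchange rate is an assumption for early 2025, GST is 10%, and the rest of the quoted range is retail markup and scarcity pricing:

```python
# Rough landed-cost sketch for a US$2,100 AIB RTX 5090 in Australia.
# The exchange rate is an assumed early-2025 figure; GST is 10%.
usd_price = 2000 + 100        # MSRP plus a low-end AIB premium, USD
usd_per_aud = 0.62            # assumed exchange rate
gst = 0.10                    # Australian Goods and Services Tax

base_aud = usd_price / usd_per_aud
with_gst = base_aud * (1 + gst)
print(f"Converted: A${base_aud:,.0f}")   # ~A$3,387
print(f"Plus GST:  A${with_gst:,.0f}")   # ~A$3,726
# The jump from ~A$3,700 to the quoted A$4,500-5,000 is retail
# markup, shipping, and launch scarcity.
```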

At that point I need to think about how much I game, and how much I really need to spend to enjoy better gaming over my 7900 XTX!!

Because I've bought AMD flagships for 7 years now (5700 XT, 6900 XT, 7900 XTX), I'm faced with the reality that the 9070 XT with its 16GB is probably going to lose to my 7900 XTX (though maybe beat it in RT), and that my $1,000 USD flagship AMD GPU is gone now...

So I've been thinking, since I've seen both AMD's (lackluster, on upcoming GPUs) presentation and Nvidia's presentation, that the 5080 will be my next GPU (sorry AMD, you did it to yourself).

My problem also is that 16GB on a 5080 is quite pathetic for $1,000 USD in 2025, so until I see 3rd-party reviews of the 5080 to see if the 16GB is really enough, I'm still not sure about buying one...

Games are getting more demanding; we see that with the 8GB cards that just should be banned from existing in 2024/2025. So is 16GB really going to be enough!!

My guess (as I may not have bothered with Nvidia's crappy anti-consumer BS because I was buying AMD) is that Nvidia will super seed the 5080's BS 16GB at some point to maybe 20GB, then I will buy a 5080 20GB at maybe the $1,000 to $1,200 USD price point!!

As of right now, if Nvidia offered a 20GB 5080 at $1,000 to $1,200 USD it would be a no-brainer: better RT, DLSS, and 20GB of VRAM (to give me confidence that it's going to be ample for quite a few years), all at my AMD flagship price point!!
 
since I've seen both AMD's (lackluster, on upcoming GPUs) presentation and Nvidia's presentation, that the 5080 will be my next GPU (sorry AMD, you did it to yourself)
My take on RDNA4 is that it's mainly a bridge to their next epoch, in which they've said they'll reunify their consumer (RDNA) and datacenter/HPC (CDNA) architectures. RDNA4 could therefore be seen as a "strategic redeployment", more than a retreat. Also, there were rumors of a die-stacked flagship GPU they cancelled, though I didn't hear whether that was because it missed cost or performance targets.

I know that's of no help to your dilemma, but I just thought I'd add my $0.02 on what they seem to be doing.

Nvidia will super seed the 5080's BS 16GB at some point to maybe 20GB, then I will buy a 5080 20GB at maybe the $1,000 to $1,200 USD price point!!
LOL, I think you mean "supersede". I just mention that for others' benefit, because it took me a moment to figure out. Saying "super seed" in the context of talking about Super-branded cards is what threw me.

We're guessing they'll do a 24 GB RTX 5080 Super, in a year or so. Maybe sooner, if it's a 5080 Ti, but then it'll cost more.
 
My take on RDNA4 is that it's mainly a bridge to their next epoch, in which they've said they'll reunify their consumer (RDNA) and datacenter/HPC (CDNA) architectures. RDNA4 could therefore be seen as a "strategic redeployment", more than a retreat. Also, there were rumors of a die-stacked flagship GPU they cancelled, though I didn't hear whether that was because it missed cost or performance targets.

I know that's of no help to your dilemma, but I just thought I'd add my $0.02 on what they seem to be doing.


LOL, I think you mean "supersede". I just mention that for others' benefit, because it took me a moment to figure out. Saying "super seed" in the context of talking about Super-branded cards is what threw me.

We're guessing they'll do a 24 GB RTX 5080 Super, in a year or so. Maybe sooner, if it's a 5080 Ti, but then it'll cost more.
Yeah, that's the one, super seed or supersede, same <Mod Edit>!! LOL

AMD NEED, and in bigger-than-CAPS NEED, to compete.

(I made this comment on the AMD YouTube channel video I just watched.)

The 5090 just showed us that $2,000 is the norm now...

Let that sink in: a $400 to $500 USD increase over the 4090!

I can buy almost two B580s for the increase alone... (and for all intents and purposes, the B580 is a decent enough card).

As a consumer, it's scary that Nvidia can just raise their card prices so much without competition!!

What's next, a 6090 at $2,500 USD and a 6080 at $1,500 USD??

Will AMD then start to compete with the 6090 and 6080, and effectively raise prices by not actually raising prices, citing the "current market prices of cards"?

AMD needed the $1,000 to $1,500 USD 5090-beater this gen to keep prices down!!

This strategy of focusing on the mid-to-low tier is a bad move in the long run, not only for themselves but for consumers as well!!

While I'm not 100% sure what I'm going to do (keep my 7900 XTX longer and wait till next gen, or wait till the 5080 has more than 16GB of VRAM, or just buy the 5080 as it is), what I do know is that I was happy when I could pay $2,000 AUD for my 6900 XT Red Devil when it released, and then $2,100 AUD for my 7900 XTX two years later, with much better gains at basically the same cost!!