Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation

I actually massively disagree here on several fronts.

First of all, the performance floor is much tighter now - you have GPUs like the B580 that for a chill $250 MSRP offer enough juice to drive most modern games at 60 FPS at 1440p, even before upscaling tech.
As much as I like the B580, when used with any CPU other than the fastest on the market it's basically a 4060 with more VRAM for $250 (if you can find it at that price). It's still slower than a 3060 Ti, a $400 card from just over four years ago.
You might want to adjust settings, get off that "ultra" juice, but quite frankly "medium" in newer games looks like "high" or "ultra" in games from just a few years ago.
What relevance does this have at all? You can do this with any video card. It completely ignores the actual hardware performance gains being worse. Saying "just turn down the settings" only benefits the hardware vendors, who keep their margins while offering less of an improvement.
And then you have AMD focusing more on the mid-range, and probably the low end too down the road. The sub-$500 competition might actually be intense. You have the 5070 at a $549 MSRP, which means the inevitable 5060 Ti and 5060 will be priced lower, and they will be more than enough to power reasonable gaming for years to come.
This is all pure guesswork, and based on what exists now there's no reason to think there will be good value at $400 and less. The 50 series will certainly be better than the 40 series, but the 40 series wasn't good here in the first place. Even the 30 series wasn't a good value in this market segment, since the 3060 and 3050 were bad value compared to the rest of the lineup.
Nah, this kind of happened last time with the 2000 series: it's a small bump up, and the Super series are more refined models that normally offer the best performance of that generation. With the 6000 series we will see the proper gains. I figure Nvidia is kind of like this every two generations; we see something better.
You're ignoring that they already did this with the 40 series, and we're getting another small bump, but this time they're not increasing the MSRP gen on gen (at least with the cards that have been announced). With the designs we've seen so far, along with the performance of the 50-series cards that have been reviewed, it's looking like the 50 series will be about as big a jump from the 30 series as the 30 series was over the 20 series.

MSRP:
4060 - $300
3060 - $330
2060 - $350

1080p: 4060 ~18% > 3060 ~23% > 2060
1440p: 4060 ~15% > 3060 ~27% > 2060

MSRP:
4060 Ti - $400
3060 Ti - $400
2060 Super - $400

1080p: 4060 Ti ~18% > 3060 Ti ~33% > 2060 Super
1440p: 4060 Ti ~11% > 3060 Ti ~38% > 2060 Super

MSRP:
4070 - $600
3070 - $500
2070 - $500

1080p: 4070 ~13% > 3070 ~51% > 2070
1440p: 4070 ~27% > 3070 ~54% > 2070

(data from TPU reviews, as they tend to include more historical comparisons)
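
If you want to see how those gains compound, here's a minimal Python sketch using the approximate 1080p numbers above (the percentages are rounded TPU figures, so treat the output as ballpark):

```python
# Approximate 1080p generational gains from the TPU data above.
gains = {
    "xx60":    [0.23, 0.18],  # 2060 -> 3060 -> 4060
    "xx60 Ti": [0.33, 0.18],  # 2060 Super -> 3060 Ti -> 4060 Ti
    "xx70":    [0.51, 0.13],  # 2070 -> 3070 -> 4070
}

for tier, (g30, g40) in gains.items():
    total = (1 + g30) * (1 + g40) - 1  # two generations compound multiplicatively
    print(f"{tier}: 40 series is ~{total:.0%} faster than 20 series")
```

Which is the point: each individual 40-series bump looks small, but the compounded totals are why a 50-over-30 comparison can still look respectable.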
 
This is all pure guesswork, and based on what exists now there's no reason to think there will be good value at $400 and less. The 50 series will certainly be better than the 40 series, but the 40 series wasn't good here in the first place. Even the 30 series wasn't a good value in this market segment, since the 3060 and 3050 were bad value compared to the rest of the lineup.
What guesswork? Nvidia literally announced a $549 MSRP for the 5070.

It only goes down from there.

In the end what matters is the end result: there will be plenty of GPUs in the $400 area that will be very capable for most gamers who do not live in la-la land cranking visuals up like crazy in their games.
 
Its 24GB of VRAM sure is enticing. Its ray tracing performance, though? That's another story.
I have zero interest in RT or fake frames right now.

It's time for a platform upgrade; I was waiting to see what happened with the 5080 to make a decision.

Currently 12600KF + 3080 Hydro. Going to upgrade to 9800X3D + 7900 XTX + Alphacool WB and backplate.
 
Nope, those are 4080s. There are some Supers listed around the same price. Look up 4090s: $3-4k a pop. It's crazy. I paid $1600...

Edit: just to be clear, I did a very fast search on Newegg, so you could argue I didn't do enough research, but I scrolled a fair bit.
Those are the prices off Newegg and Amazon; I'm not trusting eBay or FB Marketplace. I live near a Micro Center and they've been sold out of 4080s and 4090s for a while, but have tons of 7900 XTXs.

I would have stayed with nVidia if the 4080 Super were still at MSRP or the 5080 weren't $1,100-1,300. But 85-95% of the performance at 60% of the price is just too compelling.
 
>Well Series 60 will benefit from a new process, so that one should be a good bump by default.

That will only be true for the 6090, not for the rest.

Starting with the 4090, Nvidia's GPU lineup has bifurcated into two distinct brackets: the xx90, which has morphed into a prosumer class and is unconstrained by price, and the lower-tier cards, which have to fit within specific price ranges for consumers.

Without price constraints, the 6090 will get the best available tech and specs. Price/perf will likely be constant, i.e., a 50% perf gain will likely beget a 50% price bump. This is fine, as high demand for the 4090/5090 has shown that price isn't a concern in this bracket.

For lower-tier cards that need to hit specific price points, perf isn't determined by process node, but by how many cores are allotted. For the 40- and 50-series, the HW perf gain for the lower tiers has been relatively consistent at 10-15% over the previous generation. Even if the node shrink were hugely performant, Nvidia would just cut the core allocation to fit the 10-15% range.

The 50-series' performance has followed the same trend as the 40-series', in that price/perf for every card in the lineup has been very predictable. The 60-series will follow suit, as external conditions will be largely the same: AI will still be big, there'll still be little or no competition, and TSMC N2 will cost an arm and a leg.

RDNA 4 won't be the white knight some are hoping for. AMD has been reticent about releasing info on it, which says that the perf gain isn't remarkable. If it were, marketing would've done its job and shouted it from every rooftop. Secondly, the launch is delayed by two months, which says that AMD is pegging RDNA 4 price/perf to Nvidia price/perf, same as with the previous gen. AMD will have slightly better bang/buck, as usual. But those hoping for "disruptive" pricing are setting themselves up for (yet) another disappointment.

RDNA 4 is the end of the line, and end-of-line products aren't where you'd be making big moves. That'll come with UDNA in '26, where the unified AI/GPU accelerator architecture should come into play and give AMD its best chance to compete. Win12 should be out by then, with many AI-dependent features.

Ditto the above for Intel. Intel didn't jump into the GPU game just to eke out a few pennies at the low end of what is a small market (relative to AI). Both it and AMD want to get a piece of the upcoming edge-AI (edge computing for AI) pie.
 
The 50-series' performance has followed the same trend as the 40-series', in that price/perf for every card in the lineup has been very predictable. The 60-series will follow suit, as external conditions will be largely the same: AI will still be big, there'll still be little or no competition, and TSMC N2 will cost an arm and a leg.
There have been reports that Nvidia is considering moving back to Samsung 2nm for their next gen because of how expensive TSMC 2nm is.
 
I have zero interest in RT or fake frames right now.

It's time for a platform upgrade; I was waiting to see what happened with the 5080 to make a decision.

Currently 12600KF + 3080 Hydro. Going to upgrade to 9800X3D + 7900 XTX + Alphacool WB and backplate.

That's a sweet rig you're putting together right there!

However, since the 5080 is such a disappointment, why don't you opt for the 5090 instead?

Coming from a 3080, the upgrade would be massive, and you'd be set up for years to come.

It would be a much better fit for your 9800X3D, and besides, the 7900 XTX is practically a 3-year-old GPU at this point.

Sorry to bother you, I'm just asking out of sheer curiosity.
 
That's a sweet rig you're putting together right there!

However, since the 5080 is such a disappointment, why don't you opt for the 5090 instead?

Coming from a 3080, the upgrade would be massive, and you'd be set up for years to come.

It would be a much better fit for your 9800X3D, and besides, the 7900 XTX is practically a 3-year-old GPU at this point.

Sorry to bother you, I'm just asking out of sheer curiosity.
Coz the 5090 bakes the 9800X3D and fuels NVIDIA's greed at a $3k street price? At this point I might go back to the PS5 for gaming...
 
Coz the 5090 bakes the 9800X3D and fuels NVIDIA's greed at a $3k street price? At this point I might go back to the PS5 for gaming...
I wouldn't go for the XTX, but the 5090 is not a viable alternative for someone who wants to buy a 7900 XTX at all. Buy an XTX at a reasonable price and upgrade once either Nvidia or AMD releases a card that is worth a damn without an astronomical price tag.
 
TH had to look for positives really hard. At this point, it is very apparent that nVidia wants to allocate as few wafers as humanly possible to this low-margin consumer garbage.
The 5070 should have been a 16GB card and the 5080 24GB. At least a bit of something, man.
 
I'm personally not really impressed with the 5080's increase.

I think I'm going to wait till March and see what the 9070 XT delivers; from the leaks I've seen it's basically 4080 performance.

Now if the 9070 XT's RT (not that I'm a big RT person) is good, and I'm going to get 4080 performance in both RT and raster at a cheaper price, let's say a $650-700 price tag,

I may swap out my 7900 XTX for the better RT GPU.

It's all still up in the air; until I see reviews for the 9070 XT I'm not 100% sold on anything yet.

But for the $1,000 that I normally pay for my top-of-the-range AMD card, as I have for the last two gens, I'm not seeing the value in the poor uplift over the 4080 Super to warrant a 5080 purchase!!

If I had seen 4090 performance (don't care what people say, the 5080 should be a direct replacement for the 4090) then I may have been more interested.
 
I really would not go for the 7900 XTX given AMD's FSR4 statements. That's investing in dead-end tech there.
Yeah, but given how their press for the XTX was versus what the cards ended up being, as someone who liked the Radeon brand (my first GPU was a 9700 Pro) I won't have high hopes. And IF the 9070 XT really delivers some good performance, I bet the bean counters will price it at $1,199 and call it a bargain.
 
>The 5070 should have been a 16GB card and the 5080 24GB.

Nvidia will continue to be stingy with VRAM size, given its fear of consumer GPUs cannibalizing sales of higher-margin AI parts. Even within the 50 series, a 24GB 5080 would definitely take away sales from the 32GB 5090 with respect to AI use. It's the classic innovator's dilemma.

The irony is that AMD doesn't have this concern, as its AI products aren't in demand.

VRAM size will be Nvidia's weak point for upcoming GPU/AI products. VRAM amount matters more for AI than it does for computer graphics. That's what AMD/Intel will likely leverage, higher VRAM allocations, to offset Nvidia's CUDA edge.
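
To put rough numbers on why VRAM matters so much for AI, here's a minimal sketch; the model sizes are illustrative assumptions, not tied to any specific product:

```python
# Rough VRAM needed just to hold model weights for local AI inference.
# bytes_per_param: 2.0 for FP16/BF16, 1.0 for 8-bit, 0.5 for 4-bit quantization.
def weights_vram_gb(params_billions: float, bytes_per_param: float = 2.0) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

for params in (7, 13, 30):  # illustrative model sizes, in billions of parameters
    print(f"{params}B params @ FP16: ~{weights_vram_gb(params):.0f} GB for weights alone")
```

At FP16, a 13B model already overflows a 16GB card before you even count activations and KV cache, which is exactly the gap between a 16GB 5080 and a 32GB 5090.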
 
Oh, the fun hasn't even started yet.
-Scalpers doing their scalpin' thing.
-Incoming tariff hikes wherever in the world.
-TSMC will do their price hiking again at some stage.
-AIBs adding their design costs.
-Country import costs wherever applicable.
And so on... Picture this whole thing a few months from now. Ask yourself if 9.5/10 for a 5090 at double its MSRP is still worthy. Those that hoped to get it at MSRP are now relegated down to a 5080 instead. And the 5080 crowd gets relegated down to the 5070 Ti. And so on.
A decade ago, people were excited over GPU reviews of flagships because people actually rushed out in excitement to go BUY them. Tom's website would take traffic hits, with pages loading slowly, and then report about it. Those were great days. Today, the excitement has died to new lows, with PC gamers taking the biggest hit.
Huang isn't PC-oriented anymore. He should go work in an AI lab or somewhere and give the PC segment over to someone who wants to actually fix the dire state that PC gaming is heading towards.
 
That's a sweet rig you're putting together right there!

However, since the 5080 is such a disappointment, why don't you opt for the 5090 instead?

Coming from a 3080, the upgrade would be massive, and you'd be set up for years to come.

It would be a much better fit for your 9800X3D, and besides, the 7900 XTX is practically a 3-year-old GPU at this point.

Sorry to bother you, I'm just asking out of sheer curiosity.
The 5090's real price is insane; it's even worse than the 5080.

Upgrading my platform requires draining the system and performing a few hours' worth of maintenance. Every few years I do a major overhaul and pick out the technology that will last the next three-ish years. I never go with the overpriced "halo" product that ends up scalped and unavailable due to FOMO. The Titan line has always had terrible price/performance; it was the 80 and 70 models that gave the best bang for the buck, and nVidia gutted those to push people towards their Titan. I skipped the 40 series precisely because of this dumbness.

The platform needs upgrading as the 12600 is getting long in the tooth, so it's getting replaced, and the 9800X3D is reasonably priced for its value. I was just waiting to see what nVidia did with the 5080 before choosing a GPU, and seeing it tie or only slightly beat the 7900 XTX was just sad. The 7900 XTX's MSRP was $999, and it can now easily be found at $800~$900. The math just don't math for buying a 5080 or 5090.
 
I really hate the "fake frames" comments from people, because usually it just means they don't fully understand computer graphics and are listening to loud pundits on why frame generation is the worst thing ever. As I've tried to point out in the review, framegen and MFG aren't inherently bad, but they're also not a 1-to-1 correlation with higher rendered framerates. A really good job at interpolating in between frames may be indistinguishable from actually rendering those frames, so in that sense they can be as "real" as the normally rendered frames.

The crux of the issue is that framegen and MFG aren't using new user input, and are in fact delaying screen updates and adding latency. There's a threshold for latency that varies from person to person. I'm generally fine with anything below ~50ms — probably because I'm no longer a teenager juiced up on caffeine and energy drinks. Someone else might want 40ms or less, and really competitive pro gamers might benefit from sub-20ms latency. I generally won't notice much of a delay or difference between 30ms and 40ms, but 30ms and 80ms is a different matter.
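
To put rough numbers on that, here's a minimal sketch of the extra delay interpolation adds (a deliberately simplified model, not Nvidia's actual pipeline, which also spends time on the generation itself):

```python
# Simplified latency model: interpolation needs the *next* rendered frame
# before it can display the in-between frames, so output is held back by
# roughly one render interval (generation cost ignored here).
def added_latency_ms(render_fps: float) -> float:
    return 1000.0 / render_fps

for fps in (30, 60, 120):
    print(f"{fps} fps base render: ~{added_latency_ms(fps):.1f} ms of added latency")
```

That's why framegen on top of a 30 fps base can push you past that ~50ms comfort threshold, while the same feature on a 120 fps base is barely noticeable.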

I think we'll ultimately get to a point where Nvidia will sample user input and warp and project frames, as it's doing with Reflex 2, to give frame generation techniques a better feel. And when that happens, people will still find things to complain about. But whether it's fully rendering or partially rendering and generating or something else, all computer graphics are "fake frames" and so it's really about not just the appearance but the feel of the games.

Nvidia is like any big corporation, and it's full of very smart and talented people who strive to create new and exciting things. That's the primary reason the company has been so successful. Basically, Nvidia is competing with itself right now. There are business reasons that Nvidia didn't go nuts with Blackwell, creating a chip on TSMC N2 or N3B with tweaks to optimize it for GPUs, etc. That will be saved for the next architecture I suspect.
While I agree that some people don't understand the intricacies, some of us use "fake frames" as shorthand A) because we don't want to write two paragraphs of nuance every time we comment on a post, and B) because calling FG frames "fake" is the anti-marketing pushback against Nvidia marketing them as equivalent to traditionally rendered frames.
 
There was a $400 gap between the 4080 and 4090. How did Nvidia use that to milk customers? They released no cards between those two and in fact widened the gap with the Super refresh by dropping the 4080 Super's price by $200.

So your logic is they're not going to try to rip off customers because they didn't last generation? Oh wait a minute, they tried that and got caught with their pants down trying to release a 4070 Ti as a 4080, then unlaunched it after the backlash.

Had Nvidia gotten away with it last time, they would have had another card in that stack.

They learned from last time: release the crappy one first so no one can compare them. And I'm betting we will see a card in between these two down the road.
 
Ray tracing, on the other hand, sees another doubling in ray/triangle intersection calculations, and Nvidia says the 5080 offers 170.6 teraFLOPS of RT compute, compared to 121 and 113 teraFLOPS of RT on the 4080 Super and 4080, respectively.

@JarredWaltonGPU Any thoughts on why, with 2X the triangle calculations and +50% RT compute, the 5080 is only 10% better at ray tracing?

After CES highlighted this difference in the 5000 series, I was expecting an improvement in RT, but it seems pretty linear with the 10% better rendering.
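
For reference, the ratios implied by the teraFLOPS figures quoted above work out like this:

```python
# RT compute figures quoted in the review (teraFLOPS).
rt = {"5080": 170.6, "4080 Super": 121.0, "4080": 113.0}

print(f"5080 vs 4080 Super: {rt['5080'] / rt['4080 Super'] - 1:+.0%}")  # about +41%
print(f"5080 vs 4080:       {rt['5080'] / rt['4080'] - 1:+.0%}")        # about +51%
```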
 
>The 5070 should have been a 16GB card and the 5080 24GB.

Nvidia will continue to be stingy with VRAM size, given its fear of consumer GPUs cannibalizing sales of higher-margin AI parts. Even within the 50 series, a 24GB 5080 would definitely take away sales from the 32GB 5090 with respect to AI use. It's the classic innovator's dilemma.

The irony is that AMD doesn't have this concern, as its AI products aren't in demand.

VRAM size will be Nvidia's weak point for upcoming GPU/AI products. VRAM amount matters more for AI than it does for computer graphics. That's what AMD/Intel will likely leverage, higher VRAM allocations, to offset Nvidia's CUDA edge.

It's not so much VRAM size as memory bus width; they are trying to avoid repeating the "mistake" they made with previous generations. Being stingy on memory bus width has the side effect of also limiting VRAM size, due to GDDR6 and GDDR7 currently shipping at 16Gb (2GB) per 32-bit chip.
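
The arithmetic with 16Gb (2GB) chips works out like this; a quick sketch using the announced bus widths of the 50-series cards:

```python
# One 16Gb (2GB) GDDR7 chip hangs off each 32-bit memory controller,
# so bus width directly caps VRAM capacity.
CHIP_GB = 2

for card, bus_bits in {"5070": 192, "5070 Ti": 256, "5080": 256, "5090": 512}.items():
    chips = bus_bits // 32
    print(f"{card}: {bus_bits}-bit bus -> {chips} chips -> {chips * CHIP_GB} GB")
```

A 24GB 5080 would therefore need either a wider 384-bit bus or the 24Gb (3GB) modules mentioned below.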

Think back to previous series, where you had the obscenely overpriced halo "Titan" product, then the high-end xx80 and mid-range xx70 models, which were far more reasonably priced. The "I MUST HAVE THE BEST" people would get the Titans, but most everyone else would buy into the 80 model or the occasional 70 or 70 Ti mid-range card. The value proposition was just that much better. Then the mining craze happened and people had no choice but to pay Titan prices for those previously cheaper models. That fact was not lost on nVidia, and for the 40 series they baselined price/performance at their Titan model (the 90), then scaled it down for all the lower models. To ensure the 80 and 70 models didn't value-compete with the 90 model, they chopped their memory bus and compute units down. They are just following the same tactic with the 50 series. Once Samsung 24Gb (3GB) memory modules become available, I suspect a mid-range Super or Ti refresh will happen, and you'll see memory go up along with price to maintain the same price/performance ratio.
 
While I agree that some people don't understand the intricacies, some of us use "fake frames" as shorthand A) because we don't want to write two paragraphs of nuance every time we comment on a post, and B) because calling FG frames "fake" is the anti-marketing pushback against Nvidia marketing them as equivalent to traditionally rendered frames.

We use "fake frames" because they are fake frames.

There are no special quantum blockchain AI crystal chemtrails happening here. It's just frame interpolation, pure and simple: render one frame, store it; render a second frame, store that; then interpolate 1~3 intermediate frames (aka motion blur) and push the whole thing to the output buffer, using the last frame as the first frame of the next set.
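
A minimal sketch of that pipeline in NumPy, using plain linear blending (actual framegen uses motion vectors and optical flow, so this is only the naive version of the idea):

```python
import numpy as np

def interpolate_frames(frame_a: np.ndarray, frame_b: np.ndarray, n: int) -> list:
    """Naively generate n in-between frames by linear blending."""
    steps = [i / (n + 1) for i in range(1, n + 1)]
    return [((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype) for t in steps]

# Two rendered frames (random stand-ins for real render output).
a = np.random.rand(1080, 1920, 3).astype(np.float32)
b = np.random.rand(1080, 1920, 3).astype(np.float32)

generated = interpolate_frames(a, b, n=3)  # "4x" mode: 1 rendered + 3 generated
output = [a, *generated, b]  # b must exist before any in-betweens can be shown
```

Note that the second frame has to be fully rendered before the first in-between frame can be shown, which is where the added latency comes from.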

The only reason it's even being recognized as a "frame" is because the drivers are doing a screen call to present, and that is what benchmarking programs look for to count as a unit of "FPS". It would be like arguing that upscaling and native rendering should be treated as "equal"; they are not.

Now if we want to discuss those features as QoL enhancements for performance-impaired situations, then great. Interpolation can help make a low-FPS situation appear smoother. Upscaling can help a low-FPS situation feel better by rendering at a lower resolution. Combining them means rendering at a lower resolution and interpolating the results to produce a smoother experience than would otherwise be possible. Discussing those in the context of the Titan or high-end 80 model is kinda silly; instead, those technologies mainly assist in pushing the 50, 60, and 70 models to punch above their weight class.