Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation

AMD isn't going to be competing with this. If you're in the market for this level of performance, AMD is not going to be an option.
Technically the 7900 XTX is at this level of raster performance, for $100-$150 less. (Yes, I know you can find $1,000 7900 XTXs, but most of them are selling around $850-900.)

And they're not being scalped heavily like everything from team green, so you can actually get one for $850 if you bargain hunt. (Yes, I just said $850 was a bargain. I want to shoot myself now.)

Personally I think the pricing has gotten stupid; even $850 is obscene. Maybe for a halo product like a 5090 you can justify it, but you can't for a midrange card, even an upper-midrange one like the 5080 or 7900 XTX.

Heck, I got a 7900 GRE last summer for $500, and it still bugs me that I dropped $500 on what was effectively a midrange card. (It's been excellent, by the way; this isn't throwing shade at the card, just at the pricing we have to deal with today.)
 
How is "second fastest GPU" a positive? Its not even true, the 4090 is the second fastest GPU. This is the worst 80 series release by Nvidia for a while now. Im pretty sure the last 4-5 generations of 80 series cards have outperformed the previous generation flagship (except the 2080 vs 1080ti which was also a quite disappointing generation).
Second fastest of the new generation. Don't get too hung up on the pros and cons, they're short and hardly provide the level of nuance required. What I'm saying with "second fastest" is: This is the step down from the 5090, with all the same features but less VRAM and less performance, for half the price. By those metrics it's a better "value," but value is hardly a strong point of any GPU that costs more than $500 generally speaking.

Alternatively, just read it as "Jarred needed to come up with three positives. The fact that this is a pretty weak positive tells you there's not a lot of overwhelmingly awesome things to say about the product."
 
Penultimate—you keep using that word; I do not think it means what you think it means. This wouldn't have been an issue if it weren't repeatedly misused.

penultimate adjective: next to the last
https://www.quickanddirtytips.com/articles/why-penultimate-doesnt-mean-best

It means, quite literally, "almost ultimate" or "almost the best." It can also mean the second to last, in a sequence of steps, but if you're talking about rankings? Yeah, it means second-best. Which I could say instead, but from context the meaning and intent is clear.
 
Excellent work as always, @JarredWaltonGPU ! Just one question: when you tested the card on Avatar: Frontiers of Pandora, did you use the locked ("Unobtanium") settings of the game?
No, as the chart (hopefully?) indicates, I used the ultra preset. Unobtanium does look a bit better, but I'm trying to be a bit flexible with the settings I use and not just "damn the torpedoes, max out everything!"
 
Given it's on the same node as the 40xx, is it not expected that the increase in performance would be limited? In this case, does the node hold back the hardware?

I mean, there's only so much Nvidia can do in that space. This is not me condoning this underwhelming performance at all; rather, it's more a reflection of Nvidia passing this crap off as a 'new' gen. The overall performance increase is very disappointing, to say the least, and not what would be expected.

With that said, another poster in this thread made a valid point: out of all of these cards, the only one to buy is the massively overpriced 5090. Anything below it is just disappointing. And the only reason for the 5090 is that it is the best GPU, but only by a hair's breadth!!!

Having only bought a 4070 Super about 3 months ago, I'm eager to see the comparison for the 5070. It will be very interesting.
I note at one point that Nvidia said nothing about an OFA for Blackwell. It seems to me that maybe the OFA was removed and that die area instead got used to provide some of the architectural enhancements. Die size for GB203 and AD103 is apparently the same, so really it's a case of rearranging resources to get higher performance, yeah.
 
These numbers are, quite honestly, disappointing; however, as was touched upon, this is most likely driver related, and once new drivers are released there should be an uptick (hopefully a strong one) in performance. I'm still sitting just fine with my EVGA 3080 FTW card, but I'm interested, so long as this makes for a suitably significant jump for the generational leap I'd get.

The AIB cards are probably going to be in the $1,400 range at the low end, seeing as Nvidia has announced chip shortages (likely an effort to drive up prices and increase profits, just as they've done for a decade), and that might be a bit more than I'm willing to pay to make the leap. I doubt I'm the only one who feels this way.
I would expect the uptick to be more like 10~15 percent faster in most cases, with some edge cases where the extra bandwidth makes it up to 30% faster. Mostly, it shouldn't go backward and run 10% slower in anything. Even with a solid 15% improvement everywhere, though, 5080 would still be pretty underwhelming.

And of course, for 1080p where the GPUs are CPU limited in a lot of games, I'd expect the 5080 to tie the 4080 Super, rather than trailing by up to 15%. Like I said in the review, there are some oddities with the early Blackwell drivers going on.
 
Ouch, the 50 series gets more disappointing by the day. The 5080 should have tied the 4090 or come extremely close. At the performance level we were given, we should have gotten a price cut to $899 or even $850. The 5090 should have kept the old price or at most gone to a $1,699 price tag. At least then these cards would be more palatable for upgraders. As it stands now, recommending that someone buy a 50 series card is difficult at best and impossible at worst. Unless your old card is smoking or VRAM starved... I don't see much motivation to spend money on these dumpster-fire cards.

As always, thank you @JarredWaltonGPU for your in-depth review. It sadly confirmed my worst fears about the new 80 series cards and only worsened my fears about the sub-80-class cards. From what I can tell from the specs, the 5070 Ti or 5070 might get the best performance increases of the stack, though we'll need to wait on reviews to know for sure. Nvidia has clearly gone with gimping the 80 class AGAIN to upsell 4K gamers to even more expensive 90 class cards.

The one silver lining I see in this generation of cards is that the used market shouldn't see steep drops in the value of existing cards, as not much has changed. Sad days
Nvidia, sad sad days... 💩
I will say, the RTX 5070 Ti seems to be relatively decent. I mean, I don't have one, but the specs put it pretty close to the 5080 for $250 less. The 5070 with 12GB is a serious concern. So out of all the 50-series so far, the 5090 wins as the biggest and baddest hombre in the west, and the 5070 Ti is its trusty sidekick. The others are groupies.
 
??

Because the 5090 exceeds the AI computation threshold for a single chip, while the 5080 does not?

Here's how it works:

[Attached image: chart showing the single-chip AI performance threshold from the export restrictions]


And by the way, that's only part of the restriction; the above is just a starter. The rest is not clear.
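For context on the threshold being discussed: the US export rules score a chip by its "total processing performance" (TPP), commonly computed as dense TOPS multiplied by the bit length of the operation, with 4800 as the headline single-chip threshold. Here's a rough sketch of that math with approximate public spec figures (my estimates, not official Nvidia or BIS numbers):

```python
# Rough TPP (total processing performance) sketch per the export rules:
# TPP = dense TOPS x bit length of the operation. The 4800 threshold and
# the TOPS figures below are approximate public numbers, not official data.

TPP_THRESHOLD = 4800

def tpp(dense_tops: float, bit_length: int) -> float:
    """Score a chip: dense throughput (TOPS) times operand bit length."""
    return dense_tops * bit_length

# Approximate dense FP8 tensor throughput (TOPS) for each card.
cards = {
    "RTX 5090": 838,   # ~1676 sparse FP8 / 2
    "RTX 5080": 450,   # ~900 sparse FP8 / 2
    "RTX 4090": 661,   # the card that got a cut-down 4090D for China
}

for name, tops in cards.items():
    score = tpp(tops, 8)
    verdict = "over" if score > TPP_THRESHOLD else "under"
    print(f"{name}: TPP ~{score:.0f} ({verdict} the {TPP_THRESHOLD} threshold)")
```

By that math the 5090 lands over the line (as the 4090 did before it, hence the cut-down 4090D), while the 5080 sits comfortably under it.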
Man, you really aren't following the conversation at all.
 
While this is expected based on the 5090 review, it's still a disappointment. Sure, it saves me money, but it really makes me wonder what will exist for anyone spending $400 or less. It feels really bad for the health of the overall market that the biggest player doesn't feel the need to move the perf/$ scale much gen on gen.
I actually massively disagree here on several fronts.

First of all, the performance floor is much tighter now. You have GPUs like the B580 that, for a chill $250 MSRP, offer enough juice to drive most modern games at 60 fps at 1440p, even before upscaling tech.

You might want to adjust settings and get off that "ultra" juice, but quite frankly, newer games' "medium" is like "high" or "ultra" for games from just a few years ago.

And then you have AMD focusing more on the mid-range, and probably the low end too down the road. The sub-$500 competition might actually be intense. You have the 5070 at $549 MSRP, which means the inevitable 5060 Ti and 5060 will be priced lower, and they will be more than enough to power reasonable gaming for years to come.
 
Is anyone really surprised? They know AMD can't compete on the high end and they aren't price competitive on the mid and low end, and they don't have the ability to generate so many fake frames, so they have absolutely zero reason to actually improve, especially when they're selling AI cards as fast as they can make them at an insane profit margin.

Though even if AMD were competitive in both performance and price, I still wouldn't buy one again until they commit to overhauling their software department. I suffered through enough of that by being loyal to them for a decade.
 
Nope, those are 4080s. There are some Supers listed around the same price. Look up 4090s. $3-4k a pop. It's crazy. I paid $1600...

Edit: just to be clear I did a very fast search on Newegg, so you could argue I didn't do enough research but I scrolled a fair bit.
Nope, I've been looking too. Can confirm this is accurate and actually has been for weeks.
 
Is anyone really surprised? They know AMD can't compete on the high end and they aren't price competitive on the mid and low end, and they don't have the ability to generate so many fake frames, so they have absolutely zero reason to actually improve, especially when they're selling AI cards as fast as they can make them at an insane profit margin.
I really hate the "fake frames" comments from people, because usually it just means they don't fully understand computer graphics and are listening to loud pundits on why frame generation is the worst thing ever. As I've tried to point out in the review, framegen and MFG aren't inherently bad, but they're also not a 1-to-1 correlation with higher rendered framerates. A really good job at interpolating in between frames may be indistinguishable from actually rendering those frames, so in that sense they can be as "real" as the normally rendered frames.

The crux of the issue is that framegen and MFG aren't using new user input, and are in fact delaying screen updates and adding latency. There's a threshold for latency that varies from person to person. I'm generally fine with anything below ~50ms — probably because I'm no longer a teenager juiced up on caffeine and energy drinks. Someone else might want 40ms or less, and really competitive pro gamers might benefit from sub-20ms latency. I generally won't notice much of a delay or difference between 30ms and 40ms, but 30ms and 80ms is a different matter.
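To put some rough numbers on that: interpolation-based framegen has to hold back one rendered frame before it can display the generated ones in between, so as a first approximation it adds about one rendered-frame time on top of the game's base latency. A minimal back-of-the-envelope sketch (my own simplified model, not measured data; the base latency figures are made up for illustration):

```python
# Rough latency sketch for interpolation-based frame generation.
# Assumption (a simplification, not from the review): the interpolator
# buffers one rendered frame, adding ~one rendered-frame time of delay.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

def framegen_latency_ms(render_fps: float, base_latency_ms: float) -> float:
    """Approximate input-to-photon latency with framegen enabled."""
    return base_latency_ms + frame_time_ms(render_fps)

# Hypothetical base latencies at three rendered framerates.
for render_fps, base in [(120, 25.0), (60, 35.0), (30, 50.0)]:
    with_fg = framegen_latency_ms(render_fps, base)
    print(f"{render_fps:>3} rendered fps: {base:.0f} ms native -> "
          f"~{with_fg:.0f} ms with framegen")
```

Which is why framegen tends to feel fine when the base framerate is already high, and much worse when it's used to rescue a 30 fps experience.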

I think we'll ultimately get to a point where Nvidia will sample user input and warp and project frames, as it's doing with Reflex 2, to give frame generation techniques a better feel. And when that happens, people will still find things to complain about. But whether it's fully rendering or partially rendering and generating or something else, all computer graphics are "fake frames" and so it's really about not just the appearance but the feel of the games.
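For anyone curious what "warp and project" means in practice: the idea behind late-stage reprojection is to take the most recent rendered frame and shift it to match the newest camera input just before display. A toy sketch of that idea (my simplification for illustration only, not Nvidia's actual Reflex 2 implementation; the small-angle shift formula assumes a simple planar projection):

```python
# Toy illustration of "late warp" reprojection: shift the latest rendered
# frame to account for camera rotation that happened after it rendered.
# This is a simplified illustration, not Nvidia's Reflex 2 algorithm.

def late_warp_shift_px(yaw_at_render: float, yaw_at_display: float,
                       fov_deg: float, width_px: int) -> int:
    """Horizontal pixel shift approximating a small camera rotation."""
    yaw_delta = yaw_at_display - yaw_at_render
    return round((yaw_delta / fov_deg) * width_px)

def warp_row(row: list, shift: int, fill=0) -> list:
    """Shift one row of pixels; the exposed edge gets a fill value."""
    if shift >= 0:
        return [fill] * shift + row[:len(row) - shift]
    return row[-shift:] + [fill] * (-shift)

# Example: the camera turned 0.5 degrees after the frame was rendered.
shift = late_warp_shift_px(90.0, 90.5, fov_deg=100.0, width_px=2560)
print(f"warp the frame by {shift} pixels")  # ~13 px at 2560 wide

row = list(range(10))
print(warp_row(row, 2))  # exposed edge pixels need filling in
```

The exposed edge is the hard part; Reflex 2 reportedly inpaints those revealed pixels, which is where the AI cleanup would come in.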

Nvidia is like any big corporation, and it's full of very smart and talented people who strive to create new and exciting things. That's the primary reason the company has been so successful. Basically, Nvidia is competing with itself right now. There are business reasons that Nvidia didn't go nuts with Blackwell, creating a chip on TSMC N2 or N3B with tweaks to optimize it for GPUs, etc. That will be saved for the next architecture I suspect.
 
The 5080 is $200 cheaper than the 4080 was. Based on the overclocking headroom TechPowerUp found, it looks like Nvidia could have released a faster card at no additional cost to them. This is probably related to the China embargo: Nvidia didn't want to have to release a 5080D, so the rest of the world gets stuck with this disappointment.

Also remember, the gap between the 3080 and the 3090 was MUCH smaller than the gap between the 4080 and the 4090. Something that is throwing everyone off is forgetting how much faster the 4090 was than everything else. That wasn't the case in previous generations, when the 80 Ti was often trailing the Titan by low single digits rather than the 30% lead the 4090 enjoyed.

It's not that deep. There's a gap there so they can milk more customers for a 5080 Ti and a 5080 Ti Super, lol.
 
I feel like all the reviews are a bit all over the map: anything from 3 stars to 5 stars, depending on the site and what they compared it to. Seems to me that if you are running a 3080 or worse, it's a great upgrade. Anything higher and it's debatable.

What I really think it highlights is that the 4090 was a damn great card for the price... which is a little interesting. Not sure how it will all shake out, but I have a damn 10GB 3080, so it's pretty clear-cut for me.
 
Nope, I've been looking too. Can confirm this is accurate and actually has been for weeks.
Yeah, RTX 4090 and 4080 are basically discontinued so anything left in the channel is just the last dregs of the supply, and prices are going up. Some places need a 4080 or 4090 because those have been validated, so they might pay more. But there are no new chips being made, I am sure — they're all going to Blackwell (both data center and RTX).

Right now, brand-new 4090 cards are starting at $2,500; they're targeting the buyers I mentioned above. The 4080 and 4080 Super (mostly the 4080 Super, as the 4080 was "discontinued" early last year when the Super arrived) can still be found at $1,100 at a few places, but if you go to Newegg it's $1,500+. That's because Newegg itself stopped getting new 4080 Super inventory months ago. Most of the cards on Newegg are now from third-party sellers.
 
While this is expected based on the 5090 review, it's still a disappointment. Sure, it saves me money, but it really makes me wonder what will exist for anyone spending $400 or less. It feels really bad for the health of the overall market that the biggest player doesn't feel the need to move the perf/$ scale much gen on gen.

Nah, this kind of happened last time with the 2000 series; it's a small bump up. The Super series cards are more refined models and normally offer the best performance of that generation. With the 6000 series we will see the proper gains. I figure Nvidia is kind of like this every two generations: we see something better.
 
I concur, but for a slightly different reason.

Come in closer, everyone... I am going to tell you a secret. They are all fake! hehe
Mind blown!

Yeah. All this "fake frames" noise will die down with time; the tech will improve just like DLSS did, and eventually there will be a way to take user input into account within those frames too.

I suspect that the next generation of consoles will be using at least 2x framegen if not more to boost visual fidelity of the games, while keeping performance and hardware prices in check, and by then many of the framegen weaknesses would probably get figured out or improved.

Console games will fall in line and will be designed with this capability in mind, and everyone else on PC will follow suit.

---

As a whole, it feels like Blackwell is a bit ahead of its time; it came in too hard, too fast with these features.

Many of the features it has will take years to pan out, at which point there will be better-suited GPUs for them too. It's sort of like Turing's tensor cores all over again.
 
Not sure it really matters if people feel the 40 series cards are a better value. There have been rumors for quite some time that Nvidia greatly reduced production of 40 series chips to make 50 series chips. People will have no choice but to buy the more expensive new cards. Nvidia would be dumb not to use their equipment to manufacture the most profitable chips they can make.

We can only hope that game companies wise up and start to design their games to not require such expensive video cards.
 
We can only hope that game companies wise up and start to design their games to not require such expensive video cards.
I think "require" is a big word there.

PC games nowadays are very flexible as far as requirements go. The super duper ultra settings require major horsepower, but the very same game can be played on medium settings and that already massively reduces the requirements, while not really looking all that bad.

Game devs/publishers understand very well that their target PC/laptop audience is overwhelmingly somewhere around the ~3050 level of GPU horsepower, give or take.
 
I really hate the "fake frames" comments from people, because usually it just means they don't fully understand computer graphics and are listening to loud pundits on why frame generation is the worst thing ever. As I've tried to point out in the review, framegen and MFG aren't inherently bad, but they're also not a 1-to-1 correlation with higher rendered framerates. A really good job at interpolating in between frames may be indistinguishable from actually rendering those frames, so in that sense they can be as "real" as the normally rendered frames.

The crux of the issue is that framegen and MFG aren't using new user input, and are in fact delaying screen updates and adding latency. There's a threshold for latency that varies from person to person. I'm generally fine with anything below ~50ms — probably because I'm no longer a teenager juiced up on caffeine and energy drinks. Someone else might want 40ms or less, and really competitive pro gamers might benefit from sub-20ms latency. I generally won't notice much of a delay or difference between 30ms and 40ms, but 30ms and 80ms is a different matter.

I think we'll ultimately get to a point where Nvidia will sample user input and warp and project frames, as it's doing with Reflex 2, to give frame generation techniques a better feel. And when that happens, people will still find things to complain about. But whether it's fully rendering or partially rendering and generating or something else, all computer graphics are "fake frames" and so it's really about not just the appearance but the feel of the games.

Nvidia is like any big corporation, and it's full of very smart and talented people who strive to create new and exciting things. That's the primary reason the company has been so successful. Basically, Nvidia is competing with itself right now. There are business reasons that Nvidia didn't go nuts with Blackwell, creating a chip on TSMC N2 or N3B with tweaks to optimize it for GPUs, etc. That will be saved for the next architecture I suspect.

But when does using "AI" to generate extra frames, and using "AI" to render at a lower quality and then upsample, just become cover for the fact that GPU manufacturers cannot keep up with the demands of modern games (mostly because they care more about the far more lucrative datacenter market), so they have to resort to tricks to make it seem like they can? This is a high-end card costing in excess of $1,000, yet without that "AI" it's a 4K60, 1440p120 card. Go back to 2018 with the RTX 2080 and you essentially get a 4K60, 1440p120 card that cost $800. In seven years, all we got is a card that costs 25% more yet has the same level of performance.

Now, I'm not against "AI" trickery as a whole. It's a tool that can greatly improve the experience on consoles, portables, and other areas where cost or power is a factor, and it can greatly extend the useful life of a piece of hardware (unless you're a Turing owner; we got the shaft big time). But I fear it has become much more of a crutch, with new hardware immediately relying on it, perhaps more than ever this generation (with Nvidia, and likely AMD too), given a generational performance increase so minor as to be practically nonexistent.


https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-founders-edition,5809.html
 
AMD isn't going to be competing with this. If you're in the market for this level of performance, AMD is not going to be an option.
If the 9070 XT is close to the 4080/5080 but much cheaper, then AMD is an excellent option.

Don't need the flagship. Just need good performance for a really good price.
 
Don't worry guys, the 5070 is 4090 levels of performance. Heh.

Thanks for the review as always, Jarred.

Not shocking as the alarm bells were very loud in that CES presentation.

The funny thing here is: even with these releases from Nvidia, AMD will still fumble the ball. If they don't, well, we all win. At least the price-conscious crowd that doesn't care about CUDA/AI and just games.

You know, a thought just came to me: AMD should revive XFire and get back into that arena. The artifacting from multi-GPU setups was, in hindsight, pretty comparable to the shenanigans with FrameGen. If people are still happy with FrameGen, I don't see why multi-GPU can't make a comeback. "But DX12 broke it!", you say? It's about the same effort as convincing any developer to use your proprietary garbage libraries for whatever "acceleration" in the graphical pipeline. More driver headaches, but we can use AI to clean the frames and keep consistency now, right? Right.

Just a thought.

Regards
 