News: Nvidia announces RTX 50-series at up to $1,999

I mean, let's be real: the 5090 is not a card for a general audience.

People are "happy" because the rest of the offerings have actually reasonable MSRP given Nvidia is practically a monopoly in that weight category.
Nvidia is a business unto itself; in the words of Jay-Z, "I'm not just a businessman, I'm a business … man". Driving the mineral, semiconductor, and energy industries all by itself … 575W was also a nice surprise. This lineup, I will agree, overall is like a good concert: it hits on so many notes. A little something for everyone. I'd like to see the reviews; I can't believe the 5070 will have close to 4090 performance. If it does, that's a nice price reduction. I'm assuming, though, the 4090 will trounce it in AI workloads; otherwise, Houston, we have a problem: that card ain't gonna sell anywhere near $549.
 
The 5080 does have 20% more cores, ~7% more memory bandwidth, and ~7% more clock speed, so it's possible (though unlikely) that it could be a third faster than the 5070 Ti if they've tweaked the numbers that carefully. I'm curious where the price vs performance scale is this generation, since with the 40 series it was all based off of the 4090.
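For the curious, the naive math on those deltas looks like this (a rough sketch assuming perfect scaling, which real games never hit; the ratios are just the spec-sheet numbers above):

```python
# Back-of-envelope only: treat the published spec deltas as ideal scaling
# ceilings (real results scale worse and land somewhere between the bounds).
cores_ratio = 1.20   # ~20% more cores (5080 vs 5070 Ti, per the spec sheet)
clock_ratio = 1.07   # ~7% higher clock speed
bw_ratio    = 1.07   # ~7% more memory bandwidth

shader_bound_ceiling = cores_ratio * clock_ratio  # ~1.28x if purely compute-bound
bandwidth_bound_ceiling = bw_ratio                # ~1.07x if purely bandwidth-bound

print(f"shader-bound ceiling:    {shader_bound_ceiling:.2f}x")
print(f"bandwidth-bound ceiling: {bandwidth_bound_ceiling:.2f}x")
```

Even the best case tops out around 1.28x, so a full third faster would need architectural gains beyond the raw specs.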
It's interesting because they don't list the cores as a comparative. I really think going forward they are pushing DLSS the most, so I actually think the bigger jump might be the AI TOPS, which is also 30% more. So you might be right; I just didn't see the diff as I used their spec comparison.

Either way, it was a very good announcement that I think surprised a lot of people by sticking close to that Super pricing across the range. The 5090 will disappoint some, but I think most that went that high have the money to burn.

Just a great presentation...and I really can't wait to see what Digits does long term. That to me was the bigger announcement but just in a very different vein.
 
I got mine at MSRP about a month after launch. Oddly enough it was recently listed for about $300 more than I paid for it.

I'll get a 5090 at MSRP at some point in 2025 I'm sure...
You've got a bot, blind luck, or live near a Microcenter. $1,599 cards were consistently sold out at most retailers. Nvidia's website never had the 4090 FE regularly available for more than a day or two max, and that's a year after release; usually it was gone in a few minutes.
 
What does that even mean? I paid like $900 for my 3080 12GB.

And you overpaid. The 3080 10GB was $699 and came out nearly 4.5 years ago. You paid $200 more for an extra 2GB of VRAM. The shortages allowed Nvidia to be greedy and charge more. AMD is also guilty of this, just to a bit lesser degree. Their RX 7000 launch prices have been much higher than they should have been too.

HUB/TechSpot has shown how, in newer titles, an RX 6800 can beat an RTX 3070 even in RT scenarios due to the 3070's pitiful 8GB of VRAM, when at launch that was obviously reversed. Games are getting more VRAM hungry. We are not far off from 12GB being a requirement for 1080p gaming, as in some titles, even at 1080p, 8GB has shown to be a limiting factor. $550 for what is basically a 1080p card, no matter how you spin it, is just insane.



What is the current definition of mainstream?
xx60 and xx70 would be what I would consider mainstream. $550 for a 12GB 70-series card is insane.
 
Pulling out 3dfx: the Voodoo wasn't voodoo when it came to pricing. A top-end card might cost you an arm, but not the leg too, whereas this is now straight-up indentured servitude. The body … then soon the mind and the soul … a Faustian type of buy-in! I don't know about you, but the fact that your mind is asking to pay more means they've already captured your mind … The Voodoo was about 200 bucks in 2000, or $366 in today's terms. Of course today's GPUs are far more complex, but this is definitely an exponential growth curve, not remotely linear!

The sad part is that the price looks reasonable. Talk about grooming!
Voodoo 1 launched at $300 in 1996. Remember, it needed a 2D card to work. I don't remember what an entry-level VGA card cost back then, but the popular pairing was with a Matrox Millennium, and that MSRP'd at $300 as well. Cut that price in half for a lower-end card and you were looking at a $450 entry price for the only card 3dfx offered. That's over $900 today. Move forward a couple of years and Voodoo 2 launches at the same $300; double that for SLI (another $300) and add a Matrox Millennium II 4MB ($300), which was the setup I had, and you were looking at $900 in 1998 money. That's $1,750 today. Want the 8MB Millennium II? That was another $100, so $1,000 in 1998 money ($1,935 today). High-end PC gaming was every bit as expensive back then.
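If anyone wants to check the math, it's roughly this (the inflation multipliers are assumed approximations, not exact index values):

```python
# Rough inflation adjustment for the setups described above. The CPI
# multipliers are assumed approximations, not exact official figures.
CPI_1996_TO_NOW = 2.0    # ~$1 in 1996 -> roughly $2 today
CPI_1998_TO_NOW = 1.94   # ~$1 in 1998 -> roughly $1.94 today

voodoo1_entry = 300 + 150        # Voodoo 1 + a cheaper 2D card (~half a Millennium)
voodoo2_sli = 300 + 300 + 300    # two Voodoo 2s in SLI + Millennium II 4MB

print(f"1996 entry setup: ${voodoo1_entry} -> ~${voodoo1_entry * CPI_1996_TO_NOW:.0f} today")
print(f"1998 SLI setup:   ${voodoo2_sli} -> ~${voodoo2_sli * CPI_1998_TO_NOW:.0f} today")
```

That lands right around the $900 and $1,750 figures above.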
 
No, that's not what I'm asking. What settings does the mainstream gamer today game at?

Settings would depend on the games played; 1080p high, I would say, is probably pretty common today, as 1080p is still the most used resolution according to Steam surveys. The same surveys show people moving up to 1440p, though, as panel prices have come way down.

You can get a high-refresh 1440p panel for like $150 these days, with a 1080p one being like $100. That's better visually without adding a lot of cost vs 1080p, whereas 4K is like double the cost of 1440p when looking at 144Hz+ panels. Not to mention the hardware requirements for 4K are much higher.

With 8GB of VRAM becoming a limiting factor even at 1080p, $550 for essentially AAA 1080p gaming is insane.
 
And you overpaid. The 3080 10GB was $699 and came out nearly 4.5 years ago. You paid $200 more for an extra 2GB of VRAM.
That's not particularly accurate, as the card has more memory bandwidth and cores as well. When set to the same power profile as the 3080 Ti, they both have around the same performance in everything because the 3080 clocks higher.
$550 for a 12GB 70-series card is insane.
For the 50 series I'd tend to agree with you here as the performance should be 4070 Ti+. I don't think the 4070/Super having 12GB was bad as those are the same performance category as the 3080. The 4070 Ti certainly should have had more than 12GB though.
 
You've got a bot, blind luck, or live near a Microcenter. $1,599 cards were consistently sold out at most retailers. Nvidia's website never had the 4090 FE regularly available for more than a day or two max, and that's a year after release; usually it was gone in a few minutes.

Don't have a bot... I'm in the Seattle metro, which has the nearest Microcenter 1,000+ miles away in Southern CA... so I guess we can call camping out on Best Buy's website blind luck.

I won't be putting that much effort in this time, though... the 3090 to 4090 jump was pretty significant. I'm quite content with the 4090 as is, so if it takes 6-9 months for the 5090 to be readily available at MSRP, so be it.
 
I just want to point out that there is one factual error in this article:
The NV1 was not used in arcades. The arcade hardware that powered Virtua Fighter 1 was the Sega Model 1, which pre-dates Nvidia. My understanding is that the PC port of Virtua Fighter 1 (which is also a port of the Saturn version rather than the arcade version) required the NV1 initially, but DirectX support was added later.
 
I see you've already onboarded, for free, into your vocabulary what nVidia commanded: "brute force rendering".

Christ... Where to even start... Let me get the cringe reaction out of the way first...

What you're saying is akin to accepting that the vision of the artists behind a game will no longer matter, because with AI we'll just get approximations of what the "idea" behind it is. We'll lose nuance and visual cues on what we'll get as a result. And this is not even taking into account the common experience for all users. If you're creating experiences "on the fly", then consistency goes out the door. A few years back, the fight was all-in on "consistency" for experiences across vendors, but it looks like the nVidia brainwashing machine has successfully convinced the new generations that AI, with its inconsistent delivery, is the way forward.

We used to bash AMD and nVidia when they had small differences in image quality, but now we celebrate them because of the "promise of more frames"? Again: Christ... We're hopeless.

Regards.
I have news for you: I onboarded quite a while ago, with a 3080 Ti, and have been using DLSS for years.

And why shouldn't I? It is clearly where the industry is going, so why would I sit here fighting windmills? You're writing a nice appeal to feelings, but it rings hollow.

Here's how I see this - the artist behind the game is able to bring out much more of their respective art and creativity compared to the past. Where previously they could only produce a "sketch", now they will be able to deliver a whole painting.

Does it have drawbacks and challenges to overcome? For sure, but the bottom-line result is net positive - you will be able to have visuals in games that otherwise would not be practical to offer.

Heck, even standard rendering has a lot of techniques and optimizations to it that "distort" the original "vision". I don't see you being up in arms about that. What triggers you so much? The bad word "AI"?

But hey, sure, let's keep waiting for that dreamland video card with a solution that offers non-AI rendering capability at performance even close to what these GPUs will offer. That ship has sailed, buddy.
 
Settings would depend on the games played; 1080p high, I would say, is probably pretty common today, as 1080p is still the most used resolution according to Steam surveys. The same surveys show people moving up to 1440p, though, as panel prices have come way down.

You can get a high-refresh 1440p panel for like $150 these days, with a 1080p one being like $100. That's better visually without adding a lot of cost vs 1080p, whereas 4K is like double the cost of 1440p when looking at 144Hz+ panels. Not to mention the hardware requirements for 4K are much higher.

With 8GB of VRAM becoming a limiting factor even at 1080p, $550 for essentially AAA 1080p gaming is insane.
It was a simple question: what is a mainstream gamer today? As you referenced, Steam shows two-thirds of gamers game at 1080p or lower, and 70% currently have a GPU with 8GB or less VRAM. 1080p, probably 60Hz, maybe creeping up to 100Hz. That's a mainstream gamer.

A 4070 averages about 100 fps at 1440p and 130 fps at 1080p. A 5070 is probably going to be 20% faster, maybe more, which is well past mainstream performance targets. An x70 card is no longer a mainstream-level card based on its performance. CPUs long ago surpassed the point of good enough for the general public. We're now seeing the same trend with GPUs, where their performance capabilities are increasing faster than the rest of the average gamer's rig is.

The best mainstream GPU currently is probably the Intel B580. Good feature set, $250, 80+ fps at 1080p, and 12GB of VRAM, which will be plenty for this target market for years. Nothing Nvidia announced yesterday was targeted at mainstream gamers.
 
It was a simple question: what is a mainstream gamer today?
I think the better question is what a mainstream gamer will be 2-3 years from now.

Even the modest GPU offerings of this generation for all 3 vendors offer a major step up in quality thanks to the technologies in place.

Will everyone magically switch to this gen's GPUs this year? No. But as you yourself say, things are creeping up: where 1080p 60Hz was a staple standard for many years, nowadays you'd be hard pressed to find anything but the most basic of monitors doing just 1080p@60Hz.

With the food comes the appetite: when, in a few years, a greater chunk of gamers rock 3060/3070-level GPUs in their rigs through modern budget cards, you will see the visual fidelity in games increase accordingly. It was always the case; after all, there was a time when everyone and their mother had a GeForce 2 MX or similar as the mainstream card in their rigs too, and somehow we went past that too.

And the driver for this creep certainly won't be the 5090, but all these mid-range or budget cards that are sure to come: there will be a 5060, a 9060, the B580, and who knows what other SKUs Team Blue/Red/Green are cooking up.
 
I think the better question is what a mainstream gamer will be 2-3 years from now.

Even the modest GPU offerings of this generation for all 3 vendors offer a major step up in quality thanks to the technologies in place.

Will everyone magically switch to this gen's GPUs this year? No. But as you yourself say, things are creeping up: where 1080p 60Hz was a staple standard for many years, nowadays you'd be hard pressed to find anything but the most basic of monitors doing just 1080p@60Hz.

With the food comes the appetite: when, in a few years, a greater chunk of gamers rock 3060/3070-level GPUs in their rigs through modern budget cards, you will see the visual fidelity in games increase accordingly. It was always the case; after all, there was a time when everyone and their mother had a GeForce 2 MX or similar as the mainstream card in their rigs too, and somehow we went past that too.

And the driver for this creep certainly won't be the 5090, but all these mid-range or budget cards that are sure to come: there will be a 5060, a 9060, the B580, and who knows what other SKUs Team Blue/Red/Green are cooking up.
The average age of the top 10 desktop GPUs on Steam is 4 years, with 3 being 5 years or older. The fastest card by far is a 4070, in 9th. The average is roughly around a 3060. Two to three years will move us up one generation, to a 4060-to-3060 Ti average. What's that, 10-20% faster? We're not looking at some dramatic upward shift in the next couple of years.
 
The average age of the top 10 desktop GPUs on Steam is 4 years, with 3 being 5 years or older. The fastest card by far is a 4070, in 9th. The average is roughly around a 3060. Two to three years will move us up one generation, to a 4060-to-3060 Ti average. What's that, 10-20% faster? We're not looking at some dramatic upward shift in the next couple of years.
I would not be so sure; the trick here is the proliferation of AI-based techniques, especially given all the vendors are now playing ball with it, with AMD joining the fun.

It's not only that a generational step up would mildly increase raster performance; it will also considerably increase AI compute, and on top of that there will be two more years of game releases and updates introducing those techniques into games.

I actually think that the next two years will result in a major step up there because of that. And then you will end up having a new console generation too, which is all but assuredly going to utilize those capabilities as well.
 
I'm debating whether or not Jensen, during the presentation, was joking when he said the 4090 was a good investment and everyone in the room had one inside their $10K PCs... That was kind of shocking to hear and tells you all you need to know about his mindset.

As for the tech they showed... I'm not impressed, honestly. They're leaning too heavily on ancillary tech for the "bigger bar better". Hallucinated frames are not what I want; they don't help in multiplayer games.

And the pricing... Still too expensive for my taste. The only card I was expecting to have a higher MSRP was the 5080, but for some bizarre reason Jensen decided to keep it at $1K. Also, if someone really believes the 5070 will be a 4090 replacement, I feel sad for them.

Overall, not impressed, but still better than AMD and Intel combined for sure.

Regards.
So you hate EVERYTHING about the new Nvidia cards and EVERYTHING about them is a gimmick. Yet, you think they're better than the competition? 🤡
 
I have news for you: I onboarded quite a while ago, with a 3080 Ti, and have been using DLSS for years.

And why shouldn't I? It is clearly where the industry is going, so why would I sit here fighting windmills? You're writing a nice appeal to feelings, but it rings hollow.

Here's how I see this - the artist behind the game is able to bring out much more of their respective art and creativity compared to the past. Where previously they could only produce a "sketch", now they will be able to deliver a whole painting.

Does it have drawbacks and challenges to overcome? For sure, but the bottom-line result is net positive - you will be able to have visuals in games that otherwise would not be practical to offer.

Heck, even standard rendering has a lot of techniques and optimizations to it that "distort" the original "vision". I don't see you being up in arms about that. What triggers you so much? The bad word "AI"?

But hey, sure, let's keep waiting for that dreamland video card with a solution that offers non-AI rendering capability at performance even close to what these GPUs will offer. That ship has sailed, buddy.
It's not fighting windmills, at all. It's an industry trend that consumers can change. It's not like nVidia is forcing you to use DLSS or buy the most expensive video card that promises the "biggest AI evah" or turn on FrameGen.

You're adopting/accepting something that, from what I gather, you don't even embrace enthusiastically? Like in a reluctant, "the industry is going there, so let's just eat it silently" kind of way?

In any case, I don't have a problem with techniques, whatever buzzword they want to use, that improve the experience without sacrificing visual fidelity. I mean, how do you reconcile that RT, which is basically real-life light bounces (or as close as the calculation can get), needs another technique that creates frames based on image analysis, plus a bit of movement information from the graphics engine, so it "guesses" what you'll do next to compensate for the heavy calculation required to "bring the best realism" to your screen and give you an actual playable experience? Or, in other words, life-like light vs a hallucinated frame on your screen, only so that the FPS metric goes up while the visual fidelity goes down. That is the main point that bothers me: there's a stupid contradiction here, and it seems like a lot of people are just ignoring it, because reasons. Also, because of the reasons I just explained/illustrated, it's not a "net positive" for me.

On the other hand, I'm not daft, and I see where upscaling works best for people and where, to a degree, frame interpolation (not generation) can help deliver a somewhat better experience, especially for people with motion sickness due to bad frame pacing. And like I mentioned in another post: there are other good things nVidia, AMD and Intel could be doing for games, but they just focus on "bigger bar better" technologies instead.

Regards.
 
So you hate EVERYTHING about the new Nvidia cards and EVERYTHING about them is a gimmick. Yet, you think they're better than the competition? 🤡
Well, I can understand that.

These cards ARE higher performance than whatever AMD has got at almost every offering; it's just a fact of life, given AMD shifted focus to mainstream GPUs and almost everything Nvidia showed off is in the high-end segment, aside from the 5070 maybe, which probably is still better than almost anything AMD has got.

Gimmick or not, for now AMD has no answer to something like the 5080, and probably not even the 5070 Ti.
 
It's not fighting windmills, at all. It's an industry trend that consumers can change. It's not like nVidia is forcing you to use DLSS or buy the most expensive video card that promises the "biggest AI evah" or turn on FrameGen.

You're adopting/accepting something that, from what I gather, you don't even embrace enthusiastically? Like in a reluctant, "the industry is going there, so let's just eat it silently" kind of way?

In any case, I don't have a problem with techniques, whatever buzzword they want to use, that improve the experience without sacrificing visual fidelity. I mean, how do you reconcile that RT, which is basically real-life light bounces (or as close as the calculation can get), needs another technique that creates frames based on image analysis, plus a bit of movement information from the graphics engine, so it "guesses" what you'll do next to compensate for the heavy calculation required to "bring the best realism" to your screen and give you an actual playable experience? Or, in other words, life-like light vs a hallucinated frame on your screen, only so that the FPS metric goes up while the visual fidelity goes down. That is the main point that bothers me: there's a stupid contradiction here, and it seems like a lot of people are just ignoring it, because reasons. Also, because of the reasons I just explained/illustrated, it's not a "net positive" for me.

On the other hand, I'm not daft, and I see where upscaling works best for people and where, to a degree, frame interpolation (not generation) can help deliver a somewhat better experience, especially for people with motion sickness due to bad frame pacing. And like I mentioned in another post: there are other good things nVidia, AMD and Intel could be doing for games, but they just focus on "bigger bar better" technologies instead.

Regards.
Consumers can't change anything here and you know why?

Because the vast majority of them do not really care about these silly fights. I am quite certain that the majority of people who fire up games don't even bother to change the video settings, and furthermore, they also don't care about squinting their eyes real hard, sticking their nose to the monitor, and looking for all those minor DLSS artifacts, if there even are any.

And I'm completely on board with that, because to be brutally honest, I'd rather have CP2077 run at 150+ FPS at 4K in the ultra preset with some odd minor shimmering I need to stand still and really focus to see, than at 50 FPS without.

It's a simple case of what you lose vs what you get, and what you lose, for the vast majority of gamers, is of no consequence compared to simply having good performance on their otherwise trash GPUs.
 
Consumers can't change anything here and you know why?

Because the vast majority of them do not really care about these silly fights. I am quite certain that the majority of people who fire up games don't even bother to change the video settings, and furthermore, they also don't care about squinting their eyes real hard, sticking their nose to the monitor, and looking for all those minor DLSS artifacts, if there even are any.

And I'm completely on board with that, because to be brutally honest, I'd rather have CP2077 run at 150+ FPS at 4K in the ultra preset with some odd minor shimmering I need to stand still and really focus to see, than at 50 FPS without.

It's a simple case of what you lose vs what you get, and what you lose, for the vast majority of gamers, is of no consequence compared to simply having good performance on their otherwise trash GPUs.
That's fair then. We can agree to disagree and I'll stop here.

Just as a final comment: I can't stand graphical glitches in any capacity, and when I tested both FG and upscaling, they just worsened the experience for me way too much. That's not even talking about the latency of FG. It reminds me of back when the first 120Hz monitors were popping up and people were like "nah, 60Hz is alright", or the usual "the eye can't see more than 24 FPS", except in this case it's the opposite extreme of the spectrum.

Regards.
 
Voodoo 1 launched at $300 in 1996. Remember, it needed a 2D card to work. I don't remember what an entry-level VGA card cost back then, but the popular pairing was with a Matrox Millennium, and that MSRP'd at $300 as well. Cut that price in half for a lower-end card and you were looking at a $450 entry price for the only card 3dfx offered. That's over $900 today. Move forward a couple of years and Voodoo 2 launches at the same $300; double that for SLI (another $300) and add a Matrox Millennium II 4MB ($300), which was the setup I had, and you were looking at $900 in 1998 money. That's $1,750 today. Want the 8MB Millennium II? That was another $100, so $1,000 in 1998 money ($1,935 today). High-end PC gaming was every bit as expensive back then.
The Voodoo's price dropped precipitously and it sold under MSRP from the get-go. You can argue it was expensive for its time, but the nascent 3D and GPU market was neither mature enough nor ubiquitous enough to sustain the prices you see today. There was no crypto mining, no uber 4K pixel-pushing requirements, and certainly not an AI hotbed in the hands of mass consumption. Deep Blue was doing what, playing chess? Come now, this price gouging is an exponential growth curve driven by an expansion in use cases that GPUs are simply good at solving and the consolidation of both hardware and software around Nvidia to support those use cases … no ATI, no 3dfx, no one … competing to bring the market to life. You have a behemoth in Nvidia cornering the market and behemoths along the entire supply chain, including TSMC and ASML, amongst a few key players. Simply put, these companies are charging what they want, end of story, and the circumstances allow it. Think Intel circa '91, its monopoly on x86, and the absurdity of their Pentium pricing.
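To put a rough number on that curve, using figures already in this thread (the ~$366 inflation-adjusted Voodoo price from earlier and the $1,999 5090 MSRP; the 25-year span is approximate):

```python
# Quick compound-growth sketch from numbers quoted in this thread: the ~$366
# inflation-adjusted Voodoo price (circa 2000) and the $1,999 RTX 5090 MSRP.
# The 25-year span and the comparison itself are rough assumptions.
then_price = 366.0   # Voodoo, in today's dollars (figure quoted earlier)
now_price = 1999.0   # RTX 5090 MSRP
years = 25           # roughly 2000 -> 2025

cagr = (now_price / then_price) ** (1 / years) - 1
print(f"Implied real top-end price growth: ~{cagr * 100:.1f}% per year, compounded")
```

That works out to roughly 7% a year in real terms, compounding over 25 years, which is a compounding curve rather than a linear one.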
 