News Nvidia announces RTX 50-series at up to $1,999

So, launch-pricing-wise: a $400 increase for the 90, the same price point for the 80, and $50 dropped off the 70 Ti and 70.

I do have to wonder how long it will take for buyers to have enough of Nvidia locking software upgrades behind hardware generations. Both AMD and Intel have had more customer-centric approaches, though it does sound like AMD's new FSR might have a hardware requirement. So long as Intel can keep leveraging XMX, they should be able to maintain backwards compatibility across generations. Hopefully whatever AMD is doing will be similar, so they can easily carry it forward.

While this generation still doesn't seem fast enough to make me want to upgrade, at least they're not raising prices again. The announcement press release from Nvidia said the 80/90 launch this month and the 70/70 Ti in February.
Exactly
 
Yeah ... we'll have to see what numbers GN and JTC come up with to figure out what's worth what. It still looks like they're pulling the same trick from the 40-series launch, using the price/specs of the lower cards to upsell the consumer to the xx90 model.

All that DLSS, AI, magic-crystals BS isn't important and doesn't matter; only the raw rendering horsepower does.
 
What's crazy is that $2K is absurd, yet everyone is happy about this pricing. Talk about mindshare … In the words of a wise man, DJ Khaled: “and another one.”
I mean, let's be real, the 5090 is not a card for a general audience.

People are "happy" because the rest of the offerings actually have reasonable MSRPs, given that Nvidia is practically a monopoly in that weight category.
 
All that DLSS, AI, magic-crystals BS isn't important and doesn't matter; only the raw rendering horsepower does.
I disagree here; all three retail GPU vendors now rely on AI in one way or another. It's pretty clear this is going to be the way forward, whether someone likes it or not. We're not going back to brute force rendering.

And I think it's a good thing, because it will indeed allow visuals at performance levels that would otherwise be unfeasible on a desktop PC. Yes, these techniques may introduce artifacts and such, but they will get better with time, and the visual-fidelity uplift will be well worth the occasional downsides.
 
I disagree here; all three retail GPU vendors now rely on AI in one way or another. It's pretty clear this is going to be the way forward, whether someone likes it or not. We're not going back to brute force rendering.

And I think it's a good thing, because it will indeed allow visuals at performance levels that would otherwise be unfeasible on a desktop PC. Yes, these techniques may introduce artifacts and such, but they will get better with time, and the visual-fidelity uplift will be well worth the occasional downsides.
I see you've already adopted, free of charge, the vocabulary nVidia dictated: "brute force rendering".

Christ... Where to even start... Let me get the cringe reaction out of the way first...

What you're saying is akin to accepting that the vision of the artists behind a game will no longer matter, because with AI we'll just get approximations of the "idea" behind it. We'll lose nuance and visual cues as a result. And this isn't even taking into account the common experience for all users. If you're creating experiences "on the fly", then consistency goes out the door. A few years back, the fight was all-in on "consistency" of experiences across vendors, but it looks like the nVidia brainwashing machine has successfully convinced the new generations that AI, with its inconsistent delivery, is the way forward.

We used to bash AMD and nVidia when they had small differences in image quality, but now we celebrate them because of the "promise of more frames"? Again: Christ... We're hopeless.

Regards.
 
What's crazy is that $2K is absurd, yet everyone is happy about this pricing. Talk about mindshare … In the words of a wise man, DJ Khaled: “and another one.”
Whether by MSRP or market conditions, Nvidia's flagship gaming GPU has been selling for over $2,000 for more than six years now, going back to the Titan RTX released in 2018. This price range isn't a new revelation, and people in this market segment have grown accustomed to the sticker shock. It's just weird that people act like high-end PC gaming suddenly got expensive every time a new generation of cards is released. Halo gaming GPUs have never had mainstream pricing, going all the way back to the 3dfx days.
 
Whether by MSRP or market conditions, Nvidia's flagship gaming GPU has been selling for over $2,000 for more than six years now, going back to the Titan RTX released in 2018. This price range isn't a new revelation, and people in this market segment have grown accustomed to the sticker shock. It's just weird that people act like high-end PC gaming suddenly got expensive every time a new generation of cards is released. Halo gaming GPUs have never had mainstream pricing, going all the way back to the 3dfx days.
Except even the mainstream cards are too expensive. Nvidia is making the same plays as the auto industry and pricing itself out of the market.
 
Far Cry 6 is native rendering. Looks like maybe 25-30% faster for the 5090. Same thing for the 5080, which should put it in the ballpark of the 4090; the 4090 was about 25% faster at 4K compared to the 4080.
[Attached image: Nvidia performance slide]
Looks closer to 25% than 30%. Also, there's a good chance that the boost comes from ray-tracing optimizations, not raw power, as the slide also mentions RT.
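For what it's worth, here's the rough relative-performance math I'm working from; a quick sketch treating the 4080 as the baseline, where the ~25-30% uplifts are assumptions read off the slide, not measured benchmarks:

```python
# Back-of-the-envelope relative performance, using the rough
# percentages discussed above (assumed, not benchmarked).

baseline_4080 = 1.00
perf_4090 = baseline_4080 * 1.25   # ~25% faster than the 4080 at 4K
perf_5080 = baseline_4080 * 1.27   # assumed ~25-30% uplift over the 4080
perf_5090 = perf_4090 * 1.27       # assumed ~25-30% uplift over the 4090

# If both uplifts hold, the 5080 lands roughly at 4090 level.
print(f"5080 vs 4090: {perf_5080 / perf_4090:.2f}x")      # ~1.02x
print(f"5090 vs 4080: {perf_5090 / baseline_4080:.2f}x")  # ~1.59x
```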
 
I see you've already adopted, free of charge, the vocabulary nVidia dictated: "brute force rendering".

Christ... Where to even start... Let me get the cringe reaction out of the way first...

What you're saying is akin to accepting that the vision of the artists behind a game will no longer matter, because with AI we'll just get approximations of the "idea" behind it. We'll lose nuance and visual cues as a result. And this isn't even taking into account the common experience for all users. If you're creating experiences "on the fly", then consistency goes out the door. A few years back, the fight was all-in on "consistency" of experiences across vendors, but it looks like the nVidia brainwashing machine has successfully convinced the new generations that AI, with its inconsistent delivery, is the way forward.

We used to bash AMD and nVidia when they had small differences in image quality, but now we celebrate them because of the "promise of more frames"? Again: Christ... We're hopeless.

Regards.

They've already drunk the "AI" kool-aid; nothing can be done now. Even the term "artificial intelligence" is complete and utter BS: there is no intelligence, and nothing is artificial. It's purely a buzzword that took off because someone applied statistics to the English language for a chatbot. At its root, it's merely pattern-based prediction. Nothing magic, nothing special, and it's been around for decades. The only thing that is "new" is that someone was able to successfully implement the backpropagation algorithm in CUDA, then scrape the internet and feed it everything to create a stochastic parrot.


For sure an amazing feat; this method can be used to make fairly accurate guesses about what comes next. It's a very complicated system for answering a very basic question: solve for a future Y when the past is X.

And none of that, absolutely none, is being used in "rendering". Instead, they are using an upscaling model that nVidia already built ahead of time. In order for it to be "AI", it would first have to build a model from all previously rendered frames of the game, and that is very computationally expensive.
 
They've already drunk the "AI" kool-aid; nothing can be done now. Even the term "artificial intelligence" is complete and utter BS: there is no intelligence, and nothing is artificial. It's purely a buzzword that took off because someone applied statistics to the English language for a chatbot. At its root, it's merely pattern-based prediction. Nothing magic, nothing special, and it's been around for decades. The only thing that is "new" is that someone was able to successfully implement the backpropagation algorithm in CUDA, then scrape the internet and feed it everything to create a stochastic parrot.


For sure an amazing feat; this method can be used to make fairly accurate guesses about what comes next. It's a very complicated system for answering a very basic question: solve for a future Y when the past is X.

And none of that, absolutely none, is being used in "rendering". Instead, they are using an upscaling model that nVidia already built ahead of time. In order for it to be "AI", it would first have to build a model from all previously rendered frames of the game, and that is very computationally expensive.
All this, agreed. And in regard to gaming and the touted features, in 9 games out of 10 they don't work correctly or they reduce visual quality markedly. I have to question the visual acuity of the supporters of these features. If these features worked every time, all the time, and were not developer-dependent, I would wholeheartedly support them; alas, I do not.
 
So, launch-pricing-wise: a $400 increase for the 90, the same price point for the 80, and $50 dropped off the 70 Ti and 70.

I do have to wonder how long it will take for buyers to have enough of Nvidia locking software upgrades behind hardware generations. Both AMD and Intel have had more customer-centric approaches, though it does sound like AMD's new FSR might have a hardware requirement. So long as Intel can keep leveraging XMX, they should be able to maintain backwards compatibility across generations. Hopefully whatever AMD is doing will be similar, so they can easily carry it forward.

While this generation still doesn't seem fast enough to make me want to upgrade, at least they're not raising prices again. The announcement press release from Nvidia said the 80/90 launch this month and the 70/70 Ti in February.
Can't wait to see what the scalpers do to that $400.
 
This is a far better way to announce your new products. Actual data. Statistical improvements, prices, and strategies we can see. I'm glad to see Nvidia pushing to refine alternative approaches to the way things have been done for the last 20 years.

Lots to take in here! As a current 4090 owner, I'm not sure yet if this will justify an upgrade, as outside of gaming I don't really use many AI-related features that would let me benefit from the new 5090. Perhaps if the Topaz AI suite, Adobe suite (local AI, not cloud-based; I don't like sharing my projects to the cloud), DaVinci, etc. utilized it in a more functional way, I could be sold. It's rare that I use ~24GB of VRAM (mostly local GPT experiments), but common that I use more than 16GB. I understand why high-end users want that higher-bandwidth 32GB of VRAM. Gotta push my game.

Looking forward to more data, benchmarks, and future developments! Great presentation!
If you're running 4K, ray tracing, and DLSS, you need lots of VRAM.
 
I just don't know who Nvidia is producing video cards for when AAA game studios are out there saying that those kinds of games don't lead to more sales. Meaning, people are playing less graphics-heavy games.

Who is Nvidia's customer then? Do you need a 5090 for Minecraft or Roblox?

https://www.tomshardware.com/video-...-graphics-cost-too-much-to-make-for-aaa-games

Good point.

From D4 to MSFS2024 there's nothing on my PC that my 4090 can't handle. Do I need a 5090? Not really. But what if I want to experience CP2077 in full PT/RT Overdrive yada yada yada glory?

Don't care. 🤣

I'll definitely upgrade to the 5090 and sell the 4090 to offset some of the cost... but I'll be doing it at MSRP ($2000?) and not scalper prices. Will it happen in 2025? We'll see what the scalpers do.
 
575 W TDP... Over here, the electricity cost comes to some 2 Euro for every 5 hours at max use. So at a (somewhat intense) average of 150 hours a month, that comes to some 60 Euro a month, or 720 Euro a year.

Well, an initial cost of 2,400 Euro (incl. VAT) I could live with; that works out to 50 Euro per month over 4 years (and I don't have a car, ...). But 50+ Euro on top of that every month is where it gets expensive for me. Let alone if the next generation in 2 years has a better performance-to-TDP ratio (or the 6080 ends up near par with the 5090, the way the 4070 Ti outperforms the 3090 Ti). So, even before benchmarks, that's already a pass on the 5090 for me. :)
That works out to roughly 0.70 Euro per kWh? Wow. I pay $0.19 per kWh (1 kWh is what you use running 1,000 watts for one hour).
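A quick back-of-the-envelope sketch of the numbers above, assuming the full 575 W TDP as sustained draw (worst case) and the 2-Euro-per-5-hours figure quoted in the post:

```python
# Rough GPU running-cost estimate from the figures quoted above.
# Assumes the full 575 W TDP as sustained draw (worst case).

TDP_KW = 0.575            # 575 W in kilowatts
COST_PER_5H_EUR = 2.0     # quoted cost for 5 hours at max use
HOURS_PER_MONTH = 150

# Implied electricity rate
rate_eur_per_kwh = COST_PER_5H_EUR / (TDP_KW * 5)          # ~0.70 EUR/kWh

monthly_eur = TDP_KW * HOURS_PER_MONTH * rate_eur_per_kwh  # ~60 EUR
yearly_eur = monthly_eur * 12                              # ~720 EUR

print(f"Implied rate: {rate_eur_per_kwh:.2f} EUR/kWh")
print(f"Monthly: {monthly_eur:.0f} EUR, yearly: {yearly_eur:.0f} EUR")
```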
 
I called it on the $1,999 MSRP for the 5090. Like, why not? How many fewer people will buy it versus shaving off a few hundred dollars? We already agree that it doesn't make business sense for nVidia to use precious advanced-node capacity and lose margin on gaming GPUs versus AI GPUs.

I mean, IMO, it'd really be more appropriate to call it the 5090 Ti since it has a 512-bit memory bus. I know that doesn't mean everything else is proportionately greater (die size, CUs, RT and Tensor cores, etc.), but it'd be easier to sell the 5090 Ti name at two grand than a "vanilla" 5090. In fact, leaving the 384-bit memory-bus gaming-GPU slot wide open is still a curious thing to me.

Lastly, it's too bad for you folks upgrading from a 4090, as you're missing out on the joy that comes from skipping a generation. That might sound silly, but I have to assert that there's real value in that "bigger jump" experience. I get it as an enthusiast wanting (*cough* thinking you have to have *cough*) the latest and greatest, but unless you're actually running out of VRAM on the 4090 (and even then), are you really being constrained?
 
They've already drunk the "AI" kool-aid; nothing can be done now. Even the term "artificial intelligence" is complete and utter BS: there is no intelligence, and nothing is artificial. It's purely a buzzword that took off because someone applied statistics to the English language for a chatbot. At its root, it's merely pattern-based prediction. Nothing magic, nothing special, and it's been around for decades. The only thing that is "new" is that someone was able to successfully implement the backpropagation algorithm in CUDA, then scrape the internet and feed it everything to create a stochastic parrot.


For sure an amazing feat; this method can be used to make fairly accurate guesses about what comes next. It's a very complicated system for answering a very basic question: solve for a future Y when the past is X.

And none of that, absolutely none, is being used in "rendering". Instead, they are using an upscaling model that nVidia already built ahead of time. In order for it to be "AI", it would first have to build a model from all previously rendered frames of the game, and that is very computationally expensive.
Leaving aside the marketing terminology and just analysing what they're doing with all the "AI tools" (again: setting aside what you think of the term), it is quite impressive for sure. There is a lot they can add to games with such tools, not just to the playing experience but also to development.

I'm not against nVidia or AMD putting "things" into games that tap into these specific accelerators to improve the experience, but there's one thing that horrifies me: hallucinated frames. That's the one place where I don't want anything "magically" generated without consistency; I want it fully intended by the developer, not the GPU.

I think NPC enhancement, generated locations (procedural or "randomised" around the story), and tweaks to subsystems that benefit from retouching (as long as it's consistent) are fine in my book. Think also of voice commands. A few games, back in the day, tried to make voice commands a thing, but failed. I think with the current tools, most GPUs on the market right now can process audio in real time and turn it into commands for games that could be played that way or would benefit from it. I know there are a few VR games where I'd love to have real voice interactions with NPCs and features, and we already have the tech for that.

You could also read all of my previous rant from the perspective that I want them to do something more meaningful, not just make "bigger bar better" for the shareholders. That's just disgusting.

Regards.
 
The 4090 at $1,599 was a very rare beast during its whole production run; expect the same for the 5090.

I got mine at MSRP about a month after launch. Oddly enough it was recently listed for about $300 more than I paid for it.

I'll get a 5090 at MSRP at some point in 2025 I'm sure...
 
Whether by MSRP or market conditions, Nvidia's flagship gaming GPU has been selling for over $2,000 for more than six years now, going back to the Titan RTX released in 2018. This price range isn't a new revelation, and people in this market segment have grown accustomed to the sticker shock. It's just weird that people act like high-end PC gaming suddenly got expensive every time a new generation of cards is released. Halo gaming GPUs have never had mainstream pricing, going all the way back to the 3dfx days.
Pulling out 3dfx: the Voodoo wasn't voodoo when it came to pricing. A top-end card might cost you an arm, but not the leg too; this is now straight-up indentured servitude. The body … then soon the mind and the soul … a Faustian-type buy-in! I don't know about you, but your mind is already asking to pay more. They've already captured your mind … the Voodoo was about 200 bucks in 2000, or $366 in today's terms. Of course today's GPUs are far more complex, but this is definitely an exponential growth curve, not remotely linear!

The sad part is that the price looks reasonable. Talk about grooming!
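For what it's worth, the inflation math above roughly checks out; a minimal sketch, assuming a cumulative US CPI factor of about 1.83 from 2000 to today (which is what the $200-to-$366 figure implies, not an official statistic), compared against the $1,999 5090 MSRP:

```python
# Inflation-adjusted comparison for the ~USD 200 Voodoo-era card price.
# The cumulative CPI factor is an assumption implied by the $366 figure
# quoted above, not an official statistic.

PRICE_2000_USD = 200
CPI_FACTOR_2000_TO_TODAY = 1.83   # assumed cumulative inflation

price_today = PRICE_2000_USD * CPI_FACTOR_2000_TO_TODAY
print(f"~${price_today:.0f} in today's dollars")                    # ~$366

# The 5090's $1,999 MSRP versus the inflation-adjusted 2000 flagship:
ratio = 1999 / price_today
print(f"~{ratio:.1f}x the inflation-adjusted 2000 flagship price")  # ~5.5x
```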