Nvidia Blackwell and GeForce RTX 50-Series GPUs: Rumors, specifications, release dates, pricing, and everything we know

Admin

Administrator
Staff member

usertests

Distinguished
Mar 8, 2013
630
587
19,760
These generations are getting launches spaced out over a long period of time. For example, there was about 8.5 months between the launches of the RTX 4090 and RTX 4060.

So Nvidia could launch the 5090 and maybe the 5080 this year, but it will likely hold back lower tiers of cards to avoid driving down the prices of the existing Lovelace cards. It would be funny to see the budget cards use the 24Gb GDDR7 chips, which are not in production yet. The RTX 5090 could get a 384-bit bus and 24 GB again, and a 96-bit 5060/5050 could get 9 GB.
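
For reference, the math behind those capacities is just one chip per 32-bit channel times the chip density. Here's a quick sketch (the 5090 and 5060/5050 configurations are purely speculative, as is the use of 24Gb chips):

# GDDR chips use a 32-bit interface, so chip count = bus width / 32 (no clamshell).
# 16Gb chips are 2GB each; the not-yet-in-production 24Gb GDDR7 chips would be 3GB each.
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    """Total VRAM in GB with one chip per 32-bit channel."""
    return (bus_width_bits // 32) * chip_density_gb

print(vram_capacity_gb(384, 2))  # 384-bit + 16Gb (2GB) chips -> 24 GB (speculative 5090)
print(vram_capacity_gb(96, 3))   # 96-bit + 24Gb (3GB) chips ->  9 GB (speculative 5060/5050)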
 
  • Like
Reactions: thisisaname
@JarredWaltonGPU

Is there any chance Nvidia will stay true to its original roadmap and release RTX 50 in 2025,

https://www.tomshardware.com/news/nvidia-ada-lovelace-successor-in-2025

or are we long past that at this point?
Note that based on that image, "Ada Lovelace-Next" lands right near the 2024/2025 transition. If you draw a vertical line upward from the "2025" at the bottom of the chart, and if that represents Jan 1, 2025, then having Blackwell first arrive at the end of 2024 makes sense. But even then, I think that timeline is less a hard roadmap and more a suggestion of when things will come out. "Hopper-Next" was announced at the beginning of March, Grace-Next hasn't been announced, and Ada-Next is still to be announced as well. Both of those may be revealed at around the same time.

Ultimately, it comes down to demand for the products. If all the Ada GPUs (particularly high-end/enthusiast) are mostly sold out, Nvidia will be more likely to release Blackwell successors sooner rather than later. And I won't name any person or company in particular, but I did speak to some folks at GTC who were pretty adamant that Blackwell consumer GPUs would arrive this year. Some people also mentioned that the Super refresh of the 40-series was "later than anticipated," but that they still didn't think it would impact Blackwell coming out this year.
TDP will decide if I upgrade. With the 1080 they got everything right: decent size, great performance. These super-high-wattage cards are just ridiculous.
If we're correct that these will all be on the TSMC 4NP node, I wouldn't expect significant improvements in power. Do keep in mind that only the 4090 has very high power use, with every other GPU drawing relatively less power than its 30-series counterpart. Sure, Ampere was higher than previous generations, but it also established a precedent that most people didn't appear to mind: more power, more performance.

I'd be very surprised to see TGP go down on any family for the next generation, or really anything going forward. Like whatever comes after Blackwell, let's say it gets made on TSMC 3N or even 2N (or maybe Intel 18A!), I bet it will still stick with ~450W for the 6090, ~320W for the 6080, ~200W for 6070, and ~120W for 6060. And there will be Ti/Super cards spaced between those, so 6070 Ti at ~280W as an example, and 6060 Ti at ~160W.
 

CmdrShepard

Prominent
Dec 18, 2023
461
338
560
My take for 5090 RTX knowing Jensen's megalomaniacal approach:

1. Base power will go from 450W to 600W
2. It will need a second 12VHPWR connector because you will be able to unlock the power limit to 750W (OEM cards will go up to 900W)
3. It will come with a 360 mm AIO preinstalled
4. It will cost at least 50% more than 4090 RTX
5. It will be almost impossible to buy

Performance? Who cares, it has all the higher numbers so it must be faster.

Oh, and it will take 4 slots.
 
My take for 5090 RTX knowing Jensen's megalomaniacal approach:

1. Base power will go from 450W to 600W
2. It will need a second 12VHPWR connector because you will be able to unlock the power limit to 750W (OEM cards will go up to 900W)
3. It will come with a 360 mm AIO preinstalled
4. It will cost at least 50% more than 4090 RTX
5. It will be almost impossible to buy

Performance? Who cares, it has all the higher numbers so it must be faster.

Oh, and it will take 4 slots.
If it comes with an AIO preinstalled, it will be 2.5 card slots at most.

The rest is spot on.
 

NightLight

Distinguished
Dec 7, 2004
571
14
19,645
If we're correct that these will all be on the TSMC 4NP node, I wouldn't expect significant improvements in power. Do keep in mind that only the 4090 has very high power use, with every other GPU drawing relatively less power than its 30-series counterpart. Sure, Ampere was higher than previous generations, but it also established a precedent that most people didn't appear to mind: more power, more performance.

I'd be very surprised to see TGP go down on any family for the next generation, or really anything going forward. Like whatever comes after Blackwell, let's say it gets made on TSMC 3N or even 2N (or maybe Intel 18A!), I bet it will still stick with ~450W for the 6090, ~320W for the 6080, ~200W for 6070, and ~120W for 6060. And there will be Ti/Super cards spaced between those, so 6070 Ti at ~280W as an example, and 6060 Ti at ~160W.
If only they could get that 6080 into that 200-250W sweet spot, that would really be nice. The 1080 is about 180W if I remember right, so a 140W bump between generations is maybe not that extreme, but still, it's a lot. I really need to start looking into idle power soon because my card is getting up there in years...
 
Nov 24, 2023
13
6
15
My take for 5090 RTX knowing Jensen's megalomaniacal approach:

1. Base power will go from 450W to 600W
2. It will need a second 12VHPWR connector because you will be able to unlock the power limit to 750W (OEM cards will go up to 900W)
3. It will come with a 360 mm AIO preinstalled
4. It will cost at least 50% more than 4090 RTX
5. It will be almost impossible to buy

Performance? Who cares, it has all the higher numbers so it must be faster.

Oh, and it will take 4 slots.
I'll be sticking with my 7900 XTX. I don't want to lose sleep over my PC melting.
 

Notton

Prominent
Dec 29, 2023
540
458
760
My prediction is you will see a fully enabled GB202 with 48GB, but it'll be called RTX Titan AI, arrive in 2026 Q3, and cost all of your limbs and organs.
 
  • Like
Reactions: usertests
Depends on if they gimp the bus again.

That's why the 4060 was barely better, or even worse, than the 3060, even though it "should" have been better in every case.
Their choice of memory bus effectively downgraded it, so it didn't have the generational improvement one expected.
As noted in various articles on the subject, the narrower bus is less of a problem for bandwidth and more of a problem for VRAM capacity. I've actually become far less anti-4060 Ti 16GB over time, mostly because the price dropped enough to make it more viable. $50 extra to double the VRAM from the base 4060 Ti, even if it has the same memory bandwidth, isn't actually a bad thing.

I've poked around at some more recent games, and the 16GB cards (7600 XT and 4060 Ti 16GB) just don't choke like the 8GB models. Which isn't to say they're awesome, just that if we could have had 12GB with the same bandwidth as the 8GB, that would have been more than sufficient. Instead we ran into VRAM limits. The problem is actually being able to clearly show this. For example:

[Attached image: benchmark chart comparing the RTX 4060 Ti 8GB and 16GB across the test suite]

Here the two cards are mostly evenly matched, overall, with the 8GB model (that clocks higher than our 16GB card) winning most of the games. But Spider-Man, The Last of Us, and Watch Dogs Legion show a clear benefit, and minimum fps in Diablo IV is also a significant difference — you'll get noticeable stuttering in Diablo IV with the RT Ultra settings on 8GB cards, to the point where it's actually not that playable. If I had $400 for a new graphics card right now and I wanted to go with Nvidia, I would save up for the 4060 Ti 16GB. (Well, actually, I'd buy a 4070 off eBay, but that's a different story.) I wouldn't want the 8GB 4060 or 4060 Ti... though I wouldn't actually want anything less than about a 4070 Ti Super, if we're being honest. LOL

Anyway, assuming we're correct and Nvidia moves to 24Gb chips on the lower-tier parts, I will have no real concerns with future 128-bit and 192-bit configurations. Well, depending on price, obviously. A 5060 and 5060 Ti with a 128-bit interface and 12GB, priced at $300~$450, should be totally fine. Nvidia could do something like 32Gbps speed on the 5060 and 36Gbps on the 5060 Ti to differentiate, or maybe 28Gbps and 32Gbps, or whatever.

The important thing is that GDDR7 would potentially give about a 50% boost to bandwidth and capacity for every bus width. As mentioned in the text, even a 96-bit interface with 9GB and 432 GB/s — it could even be clocked at 30Gbps and deliver 360 GB/s — should be viable for a "budget" card, particularly with a 32MB or larger L2/L3 cache. That's the new "budget" price of around $250~$300, of course.
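
To put rough numbers on that ~50% figure: peak bandwidth is just bus width times per-pin data rate divided by 8. A quick sketch of that math (the GDDR7 speeds are the speculative ones above; the 18Gbps line is where the 4060 Ti's 128-bit GDDR6 sits today for comparison):

# Peak memory bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbps) / 8
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(96, 36))   # 432.0 GB/s -- the 96-bit GDDR7 example above
print(bandwidth_gb_s(96, 30))   # 360.0 GB/s -- same bus at a lower 30Gbps bin
print(bandwidth_gb_s(128, 18))  # 288.0 GB/s -- the 4060 Ti's 128-bit GDDR6 today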
 
the narrower bus is less of a problem for bandwidth and more of a problem for VRAM capacity.
Gonna start off with: your post is about the 4060 vs its other versions.

They are all bound by the bus and the memory they have.

I am saying the bus is an issue if you compare it to the 3060.

Comparing 8GB vs 8GB, in many cases the last gen beats the newer gen due to that bus shafting.

It was a purposeful decision to put a larger gap between SKUs.

For the 4060 to play stuff better than the 3060, it effectively relies on DLSS 3 and frame gen (which have downsides you may not always want).

Every generation lately, the newer gen had the performance of the last gen's next tier up... which even holds for the rest of the 40 series.
The 4060 is the only outlier: it is worse than the 3070 in every way and loses out to the 3060 a lot.

That shouldn't ever happen.
 
  • Like
Reactions: P.Amini
Gonna start off with: your post is about the 4060 vs its other versions.

They are all bound by the bus and the memory they have.

I am saying the bus is an issue if you compare it to the 3060.

Comparing 8GB vs 8GB, in many cases the last gen beats the newer gen due to that bus shafting.

It was a purposeful decision to put a larger gap between SKUs.

For the 4060 to play stuff better than the 3060, it effectively relies on DLSS 3 and frame gen (which have downsides you may not always want).

Every generation lately, the newer gen had the performance of the last gen's next tier up... which even holds for the rest of the 40 series.
The 4060 is the only outlier: it is worse than the 3070 in every way and loses out to the 3060 a lot.

That shouldn't ever happen.
The 4060 almost never loses to the 3060, outside of a few edge cases like 4K in demanding games where neither card is actually playable. It's relatively close to the 3060 Ti. And the 4060 Ti almost always beats the 3070. So the "previous gen +1" still mostly applies.

But the big issue with the 4060 vs 3060 is less the bus width and more the lack of a significant increase in compute and/or core counts. The 4060 was limited all around. Fewer cores (24 SMs vs 28 SMs), less bandwidth, less capacity. The same applies to 3060 Ti and 4060 Ti (same capacity but fewer cores and lower raw bandwidth). The cache makes up for the lack of bandwidth, mostly, and the higher clocks make up for the reduction in core counts, mostly. But 4070 has the same core count as 3070, while everything below that level didn’t get nearly as much of a generational improvement. (But at least the 4060 got a price cut relative to 3060 12GB.)
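
As a back-of-the-envelope illustration of the clocks-versus-cores point (core counts are the official specs; the boost clocks are reference numbers, so treat the outputs as approximate):

# FP32 throughput ~= CUDA cores x 2 ops/clock (FMA) x boost clock (GHz), in TFLOPS
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000

print(round(fp32_tflops(3584, 1.78), 1))  # RTX 3060, 28 SMs: ~12.8 TFLOPS
print(round(fp32_tflops(3072, 2.46), 1))  # RTX 4060, 24 SMs: ~15.1 TFLOPS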

Note that if you compare the 4060 with the 3060 8GB, the discussion changes a lot and the 4060 is clearly faster.

So it still goes back to specs, performance, and pricing. If Nvidia keeps core counts the same for Blackwell, even with more VRAM capacity and bandwidth from GDDR7, I wouldn’t expect a massive boost in performance. But my expectation is that core counts will go up, hopefully by 25% or more for every tier.
 
  • Like
Reactions: HWOC and 35below0

zorgan.roman

Prominent
Apr 22, 2023
5
1
510
I'm not surprised that NVIDIA is rushing quickly with the 50 series. For many buyers, the 40 series was left with an overpriced disappointment. For me, I'm skipping this series because NVIDIA discouraged me with an incorrect policy, the top series will let the manufacturers cheat on the design of the card. They rob it of a high-quality 2x4 PSU 3 pcs and still hide everything behind a large heavy cooler. Silicon 4090 needs a double-sided water sprite to strengthen the card on both sides and a different power connector.
 
I'm not surprised that NVIDIA is rushing quickly with the 50 series. For many buyers, the 40 series was left with an overpriced disappointment. For me, I'm skipping this series because NVIDIA discouraged me with an incorrect policy, the top series will let the manufacturers cheat on the design of the card. They rob it of a high-quality 2x4 PSU 3 pcs and still hide everything behind a large heavy cooler. Silicon 4090 needs a double-sided water sprite to strengthen the card on both sides and a different power connector.
There's no rush or "quick" to speak of. Nvidia has had a 2-year cadence going back to the GTX 900-series in 2014. This is just sticking with that approach.

And... I'm not even sure what you're saying in the second half. Manufacturers "cheating?" "2x4 PSU 3 pcs?" "Double-sided water sprite?" Whatever translator that went through didn't quite get the message across. I'm assuming you meant triple 8-pin power connectors from the PSU (and they would be 6+2-pin, not 2x4-pin EPS12V). But the rest is gobbledygook.
 
  • Like
Reactions: HWOC

35below0

Respectable
Jan 3, 2024
1,663
701
2,090
The 5090 will generate heat and friction online, and push benchmark performance. It will probably sell around 5090 units, though.

The 5060s and 5070s are the ones that will actually turn a profit. Maybe not so much for Nvidia, but if EVGA's comments are anything to go by, the high-end GPUs don't sell in enough volume or carry enough margin to be viable for AIBs.

What I mean to say is that as much as we will argue about the 5090, not that many people will actually own the darn things.