News RTX 5090 may be surprisingly svelte — twin-slot, twin-fan model on the way, says leaker

From the Article:

""The leaker doesn’t share information about whether the 5090 will have lower power consumption than the 4090, but they said that the new GPU’s cooling design will be more efficient.""

FWIW, Nvidia has already been conducting early testing and verification of next-gen cooling modules and solutions for its RTX 50 "Blackwell" gaming GPUs.

Benchlife reported on this a while back, and several AIBs have confirmed it as well. I think you guys missed this news. Benchlife's reports are also far more legitimate and trustworthy than even @kopite7kimi's.

Anyway, the important point to note here is that the highest-wattage SKU being tested is 600W, while the lowest is 250W.

Four plans were/are in the pipeline:

On the other hand, according to information from cooling module factories we are familiar with, NVIDIA has been conducting testing and verification of cooling modules for GeForce RTX 50 series graphics cards based on the Blackwell GPU architecture, and has clearly started preparations for the GeForce RTX 50.

Although there is no clear timeframe, there are currently about four plans in progress, with the highest wattage being 600W and the lowest being 250W.

Whether NVIDIA has a chance to launch GeForce RTX 50 series graphics cards in 2024 is still difficult to say at this stage, but it is certain that AMD will not launch Radeon RX 8000 series graphics cards with the RDNA 4 GPU architecture in 2024.


via Benchlife
 
Last edited:

Notton

Prominent
Dec 29, 2023
523
451
760
If true, it sounds like they are going with what they did with RTX 4000/5000 Quadro cards.

Quadro cards have more parts of the GPU enabled, but with lower clock speeds.
The result is lower total power consumption for a similar level of performance, but less efficient use of die space.
Apple also uses the big-die, lower-clocks approach with its M series.
 
Also, it is rumored that the RTX 5090 would feature a 3-layer PCB design. Approach this leak/info with caution though, since nothing has been confirmed officially.

The upcoming FE model would feature a main PCB along with an I/O rigid board, plus a third board that houses the PCIe connection, though that last one arguably can't be regarded as a separate PCB.

But it's not clear whether that refers to three individual PCBs within a single card, or to three distinct designs of which only one will be finalized.

A dense memory layout? Four memory modules on top, five on each side, and two at the bottom, for a total of 16 GDDR7 DRAM modules on the RTX 5090 PCB.

So are we looking at a 32GB flagship SKU?

The memory layout is very dense: 4 modules on top, 5 down each side, and 2 at the bottom.


The FE uses three PCBs to leave space for double-sided flow-through cooling. The 30- and 40-series designs are single-sided flow-through, you know what I mean.
Because the bus width has increased, the PCB can't fit the memory laid flat, and a staggered layout like PG651's isn't used, so PG137's full flow-through approach isn't carried over.
Chiphell Forums (Machine Translated)
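
Quick sanity check on that module count, assuming 32-bit-wide GDDR7 devices and the common 2GB (16Gb) density per module (3GB modules would obviously change the capacity figure):

```python
# Sanity check of the rumored layout: 4 modules on top, 5 per side, 2 on the
# bottom. Assumes 32-bit-wide GDDR7 devices and 2GB (16Gb) per module.
modules = 4 + 5 + 5 + 2              # = 16
bus_width = modules * 32             # each GDDR7 device has a 32-bit interface
capacity_gb = modules * 2            # 2GB per module (assumption)

print(f"{modules} modules -> {bus_width}-bit bus, {capacity_gb}GB total")
# 16 modules -> 512-bit bus, 32GB total
```

Which would line up with the rumored 512-bit interface and a 32GB flagship.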


https://x.com/kopite7kimi/status/1793673705329148265
 

PEnns

Reputable
Apr 25, 2020
703
746
5,770
So many leaks and leakers, so little time....

PS: If there is such a thing as leakers who are "well-regarded", then why does TH even post / publish anything from the not so well-regarded ones?
 
  • Like
Reactions: ezst036

35below0

Respectable
Jan 3, 2024
1,639
690
2,090
The important point to note is that the highest-wattage SKU being tested is 600W, while the lowest is 250W.
Is that for the 5090 and 5080, or....
The 4060 draws 115W. If the 5060 draws more than double that, then what should its performance increase over the 4060 be? Astronomical?

If the rumored price tag of ~$350-400 is true, then Nvidia will need a 5050 or even a 5030 model.

Unrelated, but RTX 5050 seems like a marketing point waiting to happen; I just can't figure out how anything 5050 is actually good.
 

abufrejoval

Reputable
Jun 19, 2020
441
301
5,060
Also, it is rumored that the RTX 5090 would feature a 3-layer PCB design. Approach this leak/info with caution though, since nothing has been confirmed officially.
Pretty sure you meant a three-PCB design, not a 3-layer PCB design :)

A 3-layer PCB is both a bit odd and rather antique, I'd say.

No idea how many PCB layers GPU cards need these days, when every extra millimeter of trace length is a nightmare to handle.
 
Is that for the 5090 and 5080, or...

No specific SKU was mentioned, but it's obvious the 600W max-TDP prototype being tested should be for the flagship cards, most likely the RTX 5090, with the 5080 using a slightly lower max-TDP config.

The lowest-wattage 250W SKU being tested, meanwhile, should end up in 'upper mainstream' and/or midrange cards, depending on how we segment them these days.

I mean to say the max TDP of the flagship GPU will remain within the 600W bracket for the RTX 50 generation of cards, as AIBs have claimed. I have been in talks with some of them as well.

Whether this really pans out in the final retail product remains to be seen, since they could still make some last-minute changes (though that seems unlikely).
 
Last edited:

abufrejoval

Reputable
Jun 19, 2020
441
301
5,060
I guess going extra wide with consumer cards was always seen as Nvidia's way of keeping them from straying into servers.

But, of course, there is also a noise benefit when these boards need to dissipate 600 Watts or more.

But market segmentation these days isn't entirely vendor-driven; export bans are perhaps becoming more important.

These days Nvidia seems mostly bent on selling to China. So perhaps a dual-slot variant is meant to push consumer cards into Chinese data centers before they are put on the embargo list?

Just having the flexibility would be nice, and not just for the Chinese. I've very much disliked having the flexibility that slots provide, only to have them covered up by some big fat GPU.

I got a triple-slot PNY RTX 4090 just to make sure I could use an x8/x8 configuration if I had to, and it allowed me to do some dual-GPU LLM tests with an RTX 4070, also a PNY, which again is an extra-compact dual-slot design.

For a single-slot GPU, I might actually go wet (water cooling), which I've very much tried to avoid otherwise.
 

Eximo

Titan
Ambassador
Unrelated, but RTX 5050 seems like a marketing point waiting to happen; I just can't figure out how anything 5050 is actually good.

No 4050 yet (unless you count mobile), so we're not likely to see a 5050 anytime soon. They just released a new RTX 3050 6GB. The low-end card market doesn't need to be on the latest architecture anyway, and we've seen that quite a few times.
 
  • Like
Reactions: Metal Messiah.
Feb 2, 2024
82
49
60
If history is any guide and using a bit of algebra, the 5090 should consume somewhere around 500-550 watts. This power plug is basically the same (iirc) as the older one, so I can't see them going above 600W.
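
For what it's worth, here's the back-of-the-envelope version of that algebra (the 3090 and 4090 TGPs are official figures; assuming the same gen-on-gen growth simply repeats is just that, an assumption):

```python
# Naive extrapolation from recent flagship TGPs. 350W (RTX 3090) and 450W
# (RTX 4090) are official figures; repeating the same growth is an assumption.
tgp_3090, tgp_4090 = 350, 450
growth = tgp_4090 / tgp_3090            # ~1.29x gen-on-gen
naive_5090 = tgp_4090 * growth          # ~579W

print(f"Growth {growth:.2f}x -> naive 5090 estimate: {naive_5090:.0f}W")
# ~580W, which runs right into the ~600W ceiling of a single 12VHPWR/12V-2x6
# connector -- hence staying somewhere in the 500-600W range seems likely.
```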

Yeah... the age of needing air conditioning to game is upon us.
 
I'll be frank ("Hi, I'm Frank...") here and say that I would be absolutely shocked if Nvidia goes with a 2-slot design on the RTX 5090 Founders Edition. I have all the various Founders Editions from the past decade or so, and a lot of those can get pretty freaking toasty. You know what two of the worst offenders are?

RTX 3080 10GB and RTX 3080 Ti. Both dual-slot cards, and still with 320W and 350W TGP, respectively. (Titan RTX and 2080 Ti are also super hot, with surface temps of well over 70C while gaming unless you have a separate large fan blowing at them.)

It's not impossible to cool more power and heat than that in a dual-slot design, but mostly that will require a lot more airflow and materials that dissipate heat better. And the real kicker is that there's no indication the market as a whole is particularly worried about the use of 3-slot and even larger top-end graphics cards.

We still don't know specs, but if the consumer Blackwell stuff sticks with TSMC 4NP (like B200 AI/datacenter), it would require a massive reduction in power consumption to keep a dual-slot card in check. Or wind tunnel fans would also do the trick. Even with exotic materials, removing 450W from a dual-slot volume will be very difficult.

I think there will absolutely be dual-slot Founders Edition cards, of course. They just won't be for the 5090. A 5080 in dual-slot trim? Yeah, that's possible, especially if current rumors of 256-bit memory interface compared to 512-bit on the 5090 end up true.

Let me take it a step further: I will be very disappointed in Nvidia if it takes a step back and uses dual-slot on 5080 or 5090. Again, I've got the old 2080, 3080, 4080 cards. The 4080 shines in comparison to the previous generations, thanks in no small part to its use of the 4090 cooling solution. Or perhaps I'll be super impressed and Nvidia will make a dual-slot 350W or higher power draw card that doesn't burn my fingers if I touch it while running games. Past history suggests that won't happen, though.
 

DS426

Great
May 15, 2024
67
41
60
The only way I'll be impressed with the 5090 (FE) is if they go with a dual-slot design and it therefore comes in at 350W or lower. der8auer made a very strong case that ATX 12VHPWR shouldn't be utilized anywhere near its theoretical limit, as its tolerances are WAY lower than the ATX 8-pin PCIe connector's. Even the newer revised standard has caused problems in some cases, although it is definitely a step forward.
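
A rough sketch of der8auer's tolerance argument, using commonly cited per-pin current ratings (these vary by terminal vendor, so treat the exact amps as assumptions):

```python
# Back-of-the-envelope connector safety margins. Per-pin ratings are
# assumptions: ~9A is commonly cited for 8-pin Mini-Fit terminals, 9.5A for
# 12VHPWR/12V-2x6 terminals.
def safety_margin(pins_12v: int, amps_per_pin: float, rated_watts: float) -> float:
    capacity_watts = pins_12v * amps_per_pin * 12   # what the pins can deliver
    return capacity_watts / rated_watts

print(f"8-pin PCIe: {safety_margin(3, 9.0, 150):.2f}x headroom over its 150W rating")
print(f"12VHPWR:    {safety_margin(6, 9.5, 600):.2f}x headroom over its 600W rating")
# ~2.2x vs ~1.1x -- the newer connector runs much closer to its physical limit.
```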

Anyways, Nvidia is already invested in the high-power standard, but they can still limit burnout cases, and therefore warranty and liability headaches, by limiting power draw. Just let overclockers take TGP/TBP through the roof of their own accord, and similarly, AIB designs will push the power limit but with oversized triple-slot coolers. Power draw can't just keep going up indefinitely gen on gen because, well, last I checked, most electricity in the world isn't being generated from renewable energy. Oh, and nope, no Mr. Fusion yet to cheat us into almost infinite energy.
 

vanadiel007

Distinguished
Oct 21, 2015
237
233
18,960
I'll be frank ("Hi, I'm Frank...") here and say that I would be absolutely shocked if Nvidia goes with a 2-slot design on the RTX 5090 Founders Edition. I have all the various Founders Editions from the past decade or so, and a lot of those can get pretty freaking toasty. You know what two of the worst offenders are?

RTX 3080 10GB and RTX 3080 Ti. Both dual-slot cards, and still with 320W and 350W TGP, respectively. (Titan RTX and 2080 Ti are also super hot, with surface temps of well over 70C while gaming unless you have a separate large fan blowing at them.)

It's not impossible to cool more power and heat than that in a dual-slot design, but mostly that will require a lot more airflow and materials that dissipate heat better. And the real kicker is that there's no indication the market as a whole is particularly worried about the use of 3-slot and even larger top-end graphics cards.

We still don't know specs, but if the consumer Blackwell stuff sticks with TSMC 4NP (like B200 AI/datacenter), it would require a massive reduction in power consumption to keep a dual-slot card in check. Or wind tunnel fans would also do the trick. Even with exotic materials, removing 450W from a dual-slot volume will be very difficult.

I think there will absolutely be dual-slot Founders Edition cards, of course. They just won't be for the 5090. A 5080 in dual-slot trim? Yeah, that's possible, especially if current rumors of 256-bit memory interface compared to 512-bit on the 5090 end up true.

Let me take it a step further: I will be very disappointed in Nvidia if it takes a step back and uses dual-slot on 5080 or 5090. Again, I've got the old 2080, 3080, 4080 cards. The 4080 shines in comparison to the previous generations, thanks in no small part to its use of the 4090 cooling solution. Or perhaps I'll be super impressed and Nvidia will make a dual-slot 350W or higher power draw card that doesn't burn my fingers if I touch it while running games. Past history suggests that won't happen, though.

Maybe performance is so high they do not need to push the GPU far to increase performance over the 4090. In that case power consumption would be low, and you can get away with a smaller cooler.

And maybe the performance is in the territory of what a 4090 Ti would have been, so it's only a small uptick as opposed to a big leap forward.
 

Phaaze88

Titan
Ambassador
If history is any guide and using a bit of algebra, the 5090 should consume somewhere around 500-550 watts. This power plug is basically the same (iirc) as the older one, so I can't see them going above 600W.

Yeah... the age of needing air conditioning to game is upon us.
What..? It's been here, especially for those of us in the tropical/sub-tropical climates.
 
  • Like
Reactions: Makaveli

ekio

Reputable
Mar 24, 2021
98
123
4,710
I hope they got rid of this piece of crap power connector that is a big engineering failure.

What's the point of miniaturizing a connector when the GPUs are so huge anyway... ?
 
Maybe performance is so high they do not need to push the GPU far to increase performance over the 4090. In that case power consumption would be low, and you can get away with a smaller cooler.

And maybe the performance is in the territory of what a 4090 Ti would have been, so it's only a small uptick as opposed to a big leap forward.
Nvidia would need to pull a rabbit out of its hat to get significant performance improvements while sticking with a revised version of the existing node. As noted above, if consumer Blackwell GPUs still use 4NP, the only way to get more performance is with bigger chips and more power.

Look at Blackwell B200: dual-die, two full-reticle-sized chips linked via NV-HBI, and power basically doubled versus Hopper H100. That's because Blackwell B200 is using TSMC 4NP. I won't be surprised if Blackwell B202 ("RTX 5090") ends up as a dual-die solution as well, but that still won't reduce power consumption relative to the existing hardware. Well, it could... if Nvidia lowers the clocks a lot to get into the sweet spot for power efficiency.
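
To put a number on that last point: dynamic power scales roughly as C·V²·f, so clock reductions that also allow a voltage drop cut power disproportionately. A minimal sketch, with the simplifying assumption that voltage can be scaled in proportion to frequency:

```python
# Why down-clocking into the efficiency sweet spot helps so much:
# dynamic power ~ C * V^2 * f. If voltage scales roughly with frequency
# (a simplification), power falls with roughly the cube of the clock scale.
def relative_dynamic_power(clock_scale: float) -> float:
    voltage_scale = clock_scale          # assumption: V tracks f linearly
    return clock_scale * voltage_scale ** 2

for scale in (1.00, 0.90, 0.85, 0.80):
    print(f"{scale:.0%} clocks -> ~{relative_dynamic_power(scale):.0%} dynamic power")
# 90% clocks -> ~73% power, 85% -> ~61%, 80% -> ~51% (static/leakage ignored)
```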
 

jlake3

Distinguished
Jul 9, 2014
73
84
18,610
I remember that when another leaker reported they'd heard about the 4-slot "Titan Ada" cooler shown in OC3D's tweet and mocked up a render based on pictures they weren't allowed to show, this leaker insisted those leaks were fake and that they'd gotten it wrong. Other than the orientation of the ports, the render ended up being pretty close.

Now we've got pictures of the cooler, sources allegedly saying they're going to use the cooler, and industry trends going in the direction of bigger coolers... and this leaker says the flagship is actually gonna be slim, without presenting much of anything to back that up.

Not sure if he's being fed bad info or doesn't like the trend towards bigger coolers or what, but something doesn't add up.
 
  • Like
Reactions: scottslayer

CmdrShepard

Prominent
Dec 18, 2023
435
325
560
If they have a chip with a very large surface area and a large heat spreader on it (think Xeon-sized), that will allow more efficient heat transfer, which may result in a slimmer card.

Other than that? I see no way you can cool a 600W GPU in a 2-slot card unless it's a water block with a 360mm radiator or bigger.