News: Leaker says RTX 50-series GPUs will require substantially more power, with the RTX 5090 TDP jumping by over 100 watts

punkncat

Polypheme
Ambassador
Honestly, this is ridiculous. We already have a generation that uses a ton of power, produces a lot of heat, and has an ongoing issue with a new power connector melting and/or catching fire.

Some dipstick at Nvidia: "Hey, I know! Let's juice it with even MORE wattage."

I thought one of the advantages of more modern parts was increased performance at lower power. It seems that both the CPU and GPU manufacturers have just thrown all that out the window.
 

andrep74

Distinguished
Apr 30, 2008
14
19
18,515
I also set my goal to 200W, and have even lowered my CPU power budget to achieve this. And while the limit is entirely arbitrary, I feel like a computer shouldn't draw the same power as *a hair dryer* just to play a video game.

I can't help but wonder whether this death spiral of power consumption is driven by irresponsible users or by greedy corporations. It's a certainty that governments will step in at some point and make sure that consumers cannot purchase graphics cards above a certain wattage threshold (maybe 100W?).
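For anyone who wants to enforce a budget like this on the GPU side, a minimal sketch using nvidia-smi's power-limit option; the 200W target is just my own arbitrary number, and the supported range depends on the card:

```python
# Minimal sketch: cap GPU board power via nvidia-smi's power-limit option.
# Needs admin/root privileges, and the card must expose an adjustable limit.
import subprocess

TARGET_WATTS = 200  # arbitrary personal budget, as discussed above

# Ask the driver what range the card actually supports.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.min_limit,power.max_limit",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()
min_w, max_w = (float(x) for x in out.split(","))

# Clamp the target into the supported range, then apply it.
clamped = max(min_w, min(TARGET_WATTS, max_w))
subprocess.run(["nvidia-smi", "-pl", str(int(clamped))], check=True)
print(f"Power limit set to {clamped:.0f} W (card supports {min_w:.0f}-{max_w:.0f} W)")
```

The limit typically doesn't survive a reboot, so you'd reapply it at startup.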
 

DougMcC

Reputable
Sep 16, 2021
181
127
4,760
andrep74 said:
I also set my goal to 200W, and have even lowered my CPU power budget to achieve this. And while the limit is entirely arbitrary, I feel like a computer shouldn't draw the same power as *a hair dryer* just to play a video game.

I can't help but wonder whether this death spiral of power consumption is driven by irresponsible users or by greedy corporations. It's a certainty that governments will step in at some point and make sure that consumers cannot purchase graphics cards above a certain wattage threshold (maybe 100W?).
Government action seems extremely unlikely, particularly as a growing percentage of the people who would own such high-power PCs are now also running their own power plants.
 

emike09

Distinguished
Jun 8, 2011
192
184
18,760
andrep74 said:
I can't help but wonder whether this death spiral of power consumption is driven by irresponsible users or by greedy corporations.
I probably fall under the irresponsible user category. Power is cheap where I live, and I have more than adequate cooling in my case. Outside of melting connectors, I'll take all the power I can get, as long as there has also been an improvement in efficiency per watt compared to the previous generation.

The xx90 series (and the previous x90 series) has always been about pushing the limits and maximizing performance. It's focused on power users, workstations, and enthusiasts who want or need everything they can get from a card. With the death of SLI, we can't just pop another card in and increase performance, though some workstation applications can scale with multi-GPU setups.

It's too early to tell what efficiency improvements have been made on Blackwell, but I'm looking forward to seeing the details. Gotta put that 1600W PSU to use somehow!
 

Notton

Commendable
Dec 29, 2023
862
755
1,260
Some guesswork:
5070: 220W
5080: 250~300W
5090: it's 2x 5080 dies glued together

The 16-pin 12VHPWR spec defines 600W per connector.

So it's either 600W, or go back to the "ugly" mess of multiple 8-pin cables on one card that 12VHPWR was trying to solve.
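To put that 600W figure in numbers, a quick back-of-the-envelope sketch; the ~9.5A per-pin terminal rating is a commonly cited figure I'm assuming here, not something from the article:

```python
# How hard each 12V pin in a 16-pin 12VHPWR connector works at a given
# board power: 6 current-carrying 12V pins, with an assumed 9.5 A
# per-pin terminal rating.
RAIL_VOLTAGE = 12.0
POWER_PINS = 6
PIN_RATING_A = 9.5  # assumption, not from the article

def pin_load(board_power_w: float) -> tuple[float, float]:
    """Return (amps per pin, percent of the assumed pin rating)."""
    amps = board_power_w / RAIL_VOLTAGE / POWER_PINS
    return amps, 100 * amps / PIN_RATING_A

for watts in (450, 550, 600):
    amps, pct = pin_load(watts)
    print(f"{watts} W -> {amps:.2f} A per pin ({pct:.0f}% of rating)")
# 450 W -> 6.25 A per pin (66% of rating)
# 550 W -> 7.64 A per pin (80% of rating)
# 600 W -> 8.33 A per pin (88% of rating)
```

So even at the full 600W the connector stays inside its rating on paper, just with a lot less slack than the old 8-pins had.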
 

Heiro78

Prominent
Nov 20, 2023
57
36
560
Notton said:
Some guesswork:
5070: 220W
5080: 250~300W
5090: it's 2x 5080 dies glued together

The 16-pin 12VHPWR spec defines 600W per connector.

So it's either 600W, or go back to the "ugly" mess of multiple 8-pin cables on one card that 12VHPWR was trying to solve.
It's scary to think that the card will run at 550W and only have a 50W buffer. I don't necessarily trust all the AIBs to properly cap the wattage.
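Making that margin explicit, plus what a factory power-limit bump could do to it (the 5% and 10% bumps are hypothetical figures, not from the thread):

```python
# Headroom between a rumored 550 W draw and the 600 W connector spec,
# and how quickly a hypothetical AIB power-limit bump eats it.
CONNECTOR_LIMIT_W = 600  # 12VHPWR spec ceiling
STOCK_DRAW_W = 550       # figure from the post above

for bump in (1.00, 1.05, 1.10):  # hypothetical factory power-limit bumps
    draw = STOCK_DRAW_W * bump
    margin = CONNECTOR_LIMIT_W - draw
    print(f"+{(bump - 1) * 100:.0f}% limit -> {draw:.0f} W ({margin:+.0f} W of headroom)")
# +0% limit -> 550 W (+50 W of headroom)
# +5% limit -> 578 W (+22 W of headroom)
# +10% limit -> 605 W (-5 W of headroom)
```

A 10% bump, which wouldn't be unusual for a factory OC card, would already put it over the connector spec.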
 

Heiro78

Prominent
Nov 20, 2023
57
36
560
I suppose this is an improvement, then. But won't it be the same issue all over again, with multiple cables coming out? My setup uses an adapter from 8-pin to 12VHPWR.
 

Eximo

Titan
Ambassador
Two of the new connectors take up about the same area as three 8-pins. There weren't too many cards that reached four 8-pins.

The alternative would have been the 336W EPS cables, just running a pair of those. Though that scenario would have needed dual-8-pin-to-EPS adapters, so it's adapter cables either way.
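For comparison, a rough sketch of how many of each connector a given board power would need. The 336W EPS and 600W 12VHPWR figures come from this thread; 150W for a PCIe 8-pin is the CEM spec number, the ~575W card is hypothetical, and this ignores the 75W the slot itself can supply:

```python
# How many power connectors a given board power needs, per connector type.
import math

CONNECTOR_WATTS = {
    "PCIe 8-pin": 150,       # PCIe CEM spec figure
    "EPS 8-pin": 336,        # figure from the post above
    "12VHPWR 16-pin": 600,   # figure from the posts above
}

def connectors_needed(board_power_w: float) -> dict[str, int]:
    """Connectors of each type needed to cover the full board power."""
    return {name: math.ceil(board_power_w / watts)
            for name, watts in CONNECTOR_WATTS.items()}

print(connectors_needed(575))  # hypothetical 5090-class draw
# {'PCIe 8-pin': 4, 'EPS 8-pin': 2, '12VHPWR 16-pin': 1}
```

Which is basically the point above: one new connector, a pair of EPS, or a forest of 8-pins.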
 

TeamRed2024

Upstanding
Aug 12, 2024
187
119
260
Nvidia has 2.7 trillion reasons to not care what you think about its AI accelerator GPUs.
Hahah.... I'm honestly not sure I'm gonna care all that much about the 5090... but I am curious as to what the MSRP might be? $2500?

The 4090 already runs everything I throw at it in 4K 60 without breaking a sweat... and even the UE5 demos I've run are still 50+ fps... this is a stock setup with no OC.

Why do I need a 5090? :ROFLMAO:
 

hollywoodrose

Distinguished
Oct 22, 2011
2
3
18,515
TeamRed2024 said:
Hahah.... I'm honestly not sure I'm gonna care all that much about the 5090... but I am curious as to what the MSRP might be? $2500?

The 4090 already runs everything I throw at it in 4K 60 without breaking a sweat... and even the UE5 demos I've run are still 50+ fps... this is a stock setup with no OC.

Why do I need a 5090? :ROFLMAO:
I built my first PC last year and went with a 3060 Ti. I spent more on the CPU and the rest (7800X3D, 32GB G.Skill Trident @ 6000, 2TB Samsung 980 Pro M.2) and figured I could upgrade the GPU later when I saved up some more money. I almost bought a 4090, but I literally could not get my hands on a Founders Edition! Lol

But then rumors of the 5090 started to surface, so I set my sights on that. Everything sounded so good until the past 3 months! First there was news of the smaller bus and 28GB of RAM instead of 32. Then came the "design flaw" and the delay.

And now the fallout from the whole mess is leading to possibly higher temps and a bigger power draw. This doesn't sound good! I don't know how they missed this design flaw for so long in testing, and why it's only surfacing now. But all things considered, I don't think I'd trust the 5090 enough right now to buy one at launch. In fact, I'm thinking about just grabbing a 4090 instead. None of this sounds good!
 

pf100

Commendable
Feb 18, 2022
53
41
1,560
The 5090 will have one 12VHPWR connector because using two would cost an extra dollar, plus Nvidia likes to shove it in your face that you'll use this connector and you'll like it, because by now everyone has been gaslit into liking it even if they really didn't at first. Then everyone will nod their heads and say how much they like it, and once the Nvidia Mind Control is in full effect, they'll yell in caps lock at anyone with a burnt connector, blaming them for not "plugging it in all the way", even though the connector is not overprovisioned and runs right on the edge of its allowable current rating at least half the time (and in the case of the 5090, 95% of the time, plus some extra overcurrent for good measure). The end result is lots of burnt connectors and lots of yelling in caps lock, and Nvidia comes out smelling like a rose. Again. And everyone cheers.
 
I wonder if NV, having reached the end of the line with current tech, has to consistently pump more power through its chips to get gains over the previous generation.

Intel went through this with the Pentiums and had to change course to the Core series, which cost them money and time against AMD; it happened again with the last series, and now they're switching to big.LITTLE-style cores. Even AMD's Zen power requirements are sneaking up on some chips.
 

Phyzzi

Commendable
Oct 26, 2021
11
14
1,515
My power bill isn't much of a concern to me, as I usually offset a good chunk with solar, even while charging an EV regularly... but (typical of the USA) a bunch of my breakers are rated at 10 amps, or about 1200 watts total, and pushing nearly that much into just a computer means I would have to run it on a dedicated outlet... so no thanks. I don't think I have to set a 200-watt cap, but I do solidly feel like 600 watts between the CPU and GPU is a pretty hard limit for anyone running their computer on normal house outlets, and I will probably try to stay under 400 watts between the two, given I'd also like to run a monitor, speakers, possibly some chargers, and sometimes a VR headset off the same outlets.
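Roughly the same math in code; the 80% continuous-load derating is a common US electrical-code rule of thumb I'm assuming, and the 150W for peripherals is a guess:

```python
# Rough outlet budget for a PC sharing a circuit with its peripherals.
BREAKER_AMPS = 10    # the breaker rating mentioned above
LINE_VOLTS = 120     # typical US outlet
DERATE = 0.8         # assumed 80% rule for continuous loads

circuit_watts = BREAKER_AMPS * LINE_VOLTS   # 1200 W total
sustained_watts = circuit_watts * DERATE    # 960 W continuous budget

peripherals_watts = 150  # monitor, speakers, chargers, VR headset (guess)
pc_budget = sustained_watts - peripherals_watts
print(f"Sustained budget left for the PC: ~{pc_budget:.0f} W")
# -> ~810 W, before accounting for PSU efficiency losses
```

And since a 600W GPU plus a hot CPU can spike well past their averages, staying a few hundred watts under that ceiling is reasonable.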
 

TeamRed2024

Upstanding
Aug 12, 2024
187
119
260
Futureproofing, or for those on older generations. I always upgrade on release because the previous gen still has high resale value. That makes for a much smaller monetary contribution to stay on the latest platform.

Makes sense... and since you said that, I'll probably do the same. The 4090 is leaps and bounds better than anything else, which will also help with resale. New build here, and the 1080 Ti was so good I kept it for 7 years. :ROFLMAO: Definitely won't be first in line for a 5090, but I'll get on the list.