Leaker says RTX 50-series GPUs will require substantially more power, with the RTX 5090 TDP jumping by over 100 watts


TheHerald
An earlier poster said: "I choose my GPUs within a max TDP range of 200 W, no matter how a generation is skewed toward power consumption. So we'll see what the actual gains are in terms of performance per watt."

Exactly. I'm running my 4090 locked at 320 W. The 5090 can be released at 2 kW for all I care; I'll lock it to 320 W, see how much faster it is than my 4090, and decide if it's worth keeping.
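For anyone who wants to do the same, the cap can be set programmatically through NVML. A minimal sketch, assuming the nvidia-ml-py bindings (imported as pynvml), that the 4090 is device 0, and admin/root privileges:

    # Minimal sketch: cap the GPU at 320 W via NVML.
    # Assumes nvidia-ml-py (import name pynvml), device index 0,
    # and admin/root privileges.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    # NVML works in milliwatts.
    target_mw = 320_000
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

    if min_mw <= target_mw <= max_mw:
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"power limit set to {target_mw // 1000} W")
    else:
        print(f"320 W is outside this card's allowed range "
              f"({min_mw // 1000}-{max_mw // 1000} W)")

    pynvml.nvmlShutdown()

The one-liner equivalent from a shell is nvidia-smi -pl 320.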
 

TeamRed2024
TheHerald said: "...see how much faster it is than my 4090 and decide if it's worth keeping."

That's another thing. Is the 5090 going to be ridiculously better than everything else, the way the 4090 is? Or is it just going to be an incremental bump over the current 4090? If you ask me today whether I need a new GPU, the answer is a solid no. There's nothing out there that really pushes the 4090, at least until UE5.
 

pf100
TeamRed2024 said: "Is the 5090 going to be ridiculously better than everything else, the way the 4090 is? ... There's nothing out there that really pushes the 4090, at least until UE5."
When new "must-have" features land in future games and then run too slowly on the 4090 (think Remnant 2-style horrible performance on the 3090), it won't be that the 4090 is too slow at raster, which is probably what you're thinking of; it'll be that the game has been made too hard to run without a 5090.

The major point of the 5090 and 6090 won't be faster raster so much as faster path tracing in games where you can't turn it off, because that's increasingly how lighting will be implemented. Raster speed won't be the problem; the lighting will keep the 4090's ray-tracing hardware from getting you over 20 fps. The 5090's raster will be faster, yes, but that won't be the big deal with that card: it'll be double the 4090's ray-tracing performance, maybe along with a proprietary DLSS 4 that's required for a playable frame rate.

We already have games that look more than good enough, but Nvidia will keep pushing ever more lifelike graphics into game engines, making all existing GPUs so slow that you can either play older games or buy a 5090 for the new ones. Alan Wake 2 has ray tracing you can't turn completely off; on a GTX 1080 Ti at 1080p on medium, FSR (upscaling that older Nvidia cards only get by chance, courtesy of AMD) will get you over the 30 fps hump, but without it the game is almost unplayable. The same will most likely happen to the 4090 with the latest games in a few years (it'll be too slow), while the 5090 gets 60 fps or more. I really wish they'd stop making games look more realistic, and that Nvidia would stop making otherwise powerful GPUs obsolete, but they won't.
 

TheHerald
TeamRed2024 said: "Is the 5090 going to be ridiculously better than everything else, the way the 4090 is? ..."
Well, the 4090 was around 80% faster than my 3090 with both running at 320 watts, so it was a pretty good upgrade in both performance and perf per watt. I don't expect similar gains from the 5090, though, but who knows.
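Since both cards are pinned at the same 320 W, the perf-per-watt gain is just the raw performance gain. A quick back-of-envelope check, with illustrative frame rates rather than benchmarks:

    # Back-of-envelope: at an identical 320 W cap, an 80% fps gain
    # is an 80% perf-per-watt gain.
    fps_3090, fps_4090 = 100.0, 180.0   # illustrative numbers, not benchmarks
    watts = 320.0

    ppw_3090 = fps_3090 / watts          # 0.3125 fps per watt
    ppw_4090 = fps_4090 / watts          # 0.5625 fps per watt
    print(f"{ppw_4090 / ppw_3090:.2f}x perf per watt")   # 1.80x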
 

pf100
TheHerald said: "The 4090 was around 80% faster than my 3090 with both running at 320 watts... I don't expect similar gains from the 5090, though, but who knows."
It'll be massively faster in ray tracing and path tracing.
 

marnes
I develop video games for a living, so I like to keep up with the latest hardware. I bought the 3090 because it was so much better than anything else out there, only for the 4090 to release two years later with 60% more performance, so I bought that too. But the new 12VHPWR power cable on the 40 series was a massive mistake. My 4090's connector melted about a year after purchase, so I had to RMA the card and wait for a new one. If the 5090 still has it, hard pass. You need something a lot more robust than that skinny little plug, and it also needs to be in a different spot so it doesn't point straight at the outer edge of your case, where the cable bend eventually incinerates the connector. I bought the 180-degree CableMod adapter, which seemed great until it got recalled, then had to buy a separate vertical riser mount to get things safe again. It's been a journey.

I also question what practical purpose a 5090 would serve. The 4090 lets me game at 4K @ 120 Hz with most things maxed out, depending on the game of course. And traditional DLSS is amazing; it's only frame gen that still commonly breaks UI elements. I took screenshots of Control with and without DLSS, zoomed in, and took my time trying to spot differences, and I couldn't tell them apart. At 120 Hz it simply no longer matters at that scale.
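A difference like that can be quantified rather than eyeballed. A minimal sketch using Pillow, with hypothetical screenshot file names, assuming both captures are the same resolution:

    # Quantify the DLSS-on vs. DLSS-off difference instead of eyeballing it.
    # Uses Pillow; the screenshot file names are hypothetical.
    from PIL import Image, ImageChops

    native = Image.open("control_native.png").convert("RGB")
    dlss = Image.open("control_dlss.png").convert("RGB")

    diff = ImageChops.difference(native, dlss)
    if diff.getbbox() is None:                 # None means all-zero difference
        print("screenshots are pixel-identical")
    else:
        # Mean per-pixel luma error (0-255) as a rough magnitude.
        hist = diff.convert("L").histogram()
        mean_err = sum(i * n for i, n in enumerate(hist)) / sum(hist)
        print(f"screenshots differ, mean error {mean_err:.2f}/255")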

Sure, 8K gaming may be coming, but not at desktop viewing distances, and an 8K screen would be 4x more demanding than a 4K one. So it's looking likely I'll skip the next generation. Path tracing and RT will be worthy upgrades, of course, but I wonder whether the 5090 will handle them at high frame rates at 4K.
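For reference, the 4x figure falls straight out of the pixel counts:

    # 8K vs. 4K pixel counts: where the "4x more demanding" estimate comes from.
    pixels_4k = 3840 * 2160       # 8,294,400
    pixels_8k = 7680 * 4320       # 33,177,600
    print(pixels_8k / pixels_4k)  # 4.0 -- four times the pixels to shade per frame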
 

aberkae
The same leaker said the 5090 will be a two-slot design: a two-slot heatsink at 600 watts. 🤪 Meanwhile the 4090 draws less power and needs a three-slot-plus design. I'm pretty sure he's an Nvidia-powered bot spreading misinformation via leaks and rumors to keep Nvidia in the news cycle 24/7.
 

pf100
marnes said: "My 4090's connector melted about a year after purchase, so I had to RMA the card and wait for a new one. If the 5090 still has [the 12VHPWR connector], hard pass."
Hello, video game developer. I have a 4090 too, and I run it at an 80% power limit, 1.0 V, and 2850 MHz; it rarely goes over 350 watts and loses very little performance that way. My case is big enough for the card to sit horizontally (Fractal Torrent), and I have a new ATX 3.0 PSU with a new 12VHPWR cable, so I'm not too worried about the connector melting. I can't be 100% sure, of course, but I'm doing what I can.

After thinking about it for a long time, I can see how the CableMod adapter makes a melt more likely: the extra plug on each end of it blocks heat transfer, and the adapter itself is tiny, so the cable loses some of its ability to draw heat away from the connector at each junction. My PSU's 12VHPWR cable is a straight shot of solid wire from end to end, which pulls heat away from the connector. It takes going above and beyond with these not-so-obvious heat-transfer measures to have any confidence the connector won't melt. Nvidia should recall the 4090, warranty or not, and replace the 12VHPWR connector with something better.

I bought my barely-used 4090 from a techtuber who bought it for on-camera benchmarks and then sold it on, so I have no warranty, though I could probably beg him to RMA it for me. But I know what I'll do if the connector melts: since I'm good at soldering, I'll remove the anemic power connector from the GPU entirely, cut the 12VHPWR end off the "12VHPWR to four 8-pin" adapter that came with the card, and solder the adapter's wires directly to the board. Then the card gets its power through four 8-pin connectors, with the 12VHPWR connector eliminated completely. There's no way in hell it'll melt like that.
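For anyone running a similar cap, it's easy to confirm the card actually stays near the target by logging its power draw during a session. A minimal sketch, again assuming the pynvml (nvidia-ml-py) bindings and device index 0:

    # Minimal sketch: log GPU power draw once per second to confirm
    # it stays near the 350 W target. Assumes pynvml and device index 0.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    for _ in range(60):                               # sample for one minute
        mw = pynvml.nvmlDeviceGetPowerUsage(handle)   # reported in milliwatts
        print(f"{mw / 1000:.1f} W")
        time.sleep(1)

    pynvml.nvmlShutdown()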
 

laxman10100
This hurt my brain to read. But then again, that's likely because I'm a freak and have had access to all of the consumer-relevant information about the 5090 for a pretty lengthy amount of time... To keep things short: this article is total clickbait. The 4090 is rated for 600 W and rarely pulls over 400 W, and then only for a split second here and there during the occasional power peak in 4K gameplay at max settings. So, 1o0 wAtTs mOrE fOr ThE 5o9o mEaNs yOu NeEd A nEw eVErYtHiNg!!! In reality it means the 5090 may occasionally make a 500 W power request that most power supplies, even with a 12VHPWR cable, won't be able to serve efficiently. When that happens and your PSU shifts load poorly, you'll notice a semi-brief stutter where your fps drops for a bit and then returns to normal, generally once enough of the scene's textures are loaded and the power request shrinks.

Aside from that, waiting to snag a resale 4090 for around $1,000 would be a steal for ultimate gaming performance. Currently my 4090 paired with a 7800X3D gets well over 100 fps in every single game at max settings at 4K with ray tracing enabled and DLSS disabled. However, I plan on exchanging my 4090 for a 5090, and then doing the same thing again for the god-almighty RTX.....6090.... giggity giggity.

That is, of course, unless Nvidia decides not to renew their deal with TSMC for the manufacturing of their 6000 series chips... But we'll get the answer to that question within the first few months after the 5000 series drops.

Edit - Just wanted to clarify that my weird sentence at the top of this comment involved some intense sarcasm lol. My apologies if anyone took that the wrong way.
 

pf100
An earlier post said: "No way, 40-50% at most; the full GB202 yields will probably be used in workstation cards."
Raster isn't going to be the focus, because everything is already fast enough in raster; it's not a huge priority, though of course there will be a speed bump there (I hope). The focus for the 5000 series is dedicating more die space to ray-tracing and path-tracing circuitry, because even the 4090 can choke there in some scenarios, needing heavy upscaling and frame gen. The 4090's raster is fine at native resolution without upscaling or frame gen; what holds it back is that, under those same conditions, path tracing can drop it to 20 fps.

Nvidia is going all-in on making games as realistic as possible, and they can't do that unless path tracing gets easier to run, so that's where the speedup has to be. I don't like it, because it means more expensive GPUs, and I always thought raster and static lighting were fine anyway; art style matters far more to me than "realistic" looking games. I admit I might be wrong about 80% faster path tracing, but I know where Nvidia's priority lies, and that's path tracing.
 

subspruce
pf100 said: "Raster isn't going to be the focus... The focus for the 5000 series is dedicating more die space to ray-tracing and path-tracing circuitry..."
Yeah, so a big jump in path tracing and a 40-50% jump in raster and GPGPU compute loads.