News Nvidia RTX 50-series Blackwell designs and specifications expected to be finalized this month — RTX 5080 rumored to have 400W TGP

The 70% rule is relatively accurate for the reasons stated above, but it's easy to overspec a PSU and end up running well below peak efficiency whenever you're not heavily loading your system. A lot of people don't take into account how bad the efficiency is under 10% load (and even up to 20% for some PSUs) because it's such a small number. Unless you're heavily loading your system most of the time, though, you can easily end up drawing more power over time by overspeccing your PSU than by targeting 70% in your most common workload. Even at 600W, if you're running something like the 7800X3D, an 850W PSU is going to be fine; it's only when you're looking at Intel/non-X3D that you'd need to move to 1000W.
This also leaves room for transient spikes that have become very prevalent in recent years.
ATX 3.x has this completely covered.
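To put rough numbers on the low-load efficiency point above, here's a quick Python sketch. The efficiency curve and the 80W idle figure are assumptions for illustration (a roughly Gold-class shape), not measurements; only the 600W gaming load comes from the post. Substitute the curve from a review of your actual unit.

```python
# Back-of-the-envelope PSU sizing: the efficiency curve below is an assumed
# Gold-ish shape for illustration only -- substitute measured numbers from a
# review of your actual unit.

def efficiency(load_fraction, curve):
    """Linearly interpolate efficiency from (load fraction -> efficiency) points."""
    pts = sorted(curve.items())
    if load_fraction <= pts[0][0]:
        return pts[0][1]
    for (f0, e0), (f1, e1) in zip(pts, pts[1:]):
        if load_fraction <= f1:
            return e0 + (e1 - e0) * (load_fraction - f0) / (f1 - f0)
    return pts[-1][1]

def wall_draw(dc_load_w, psu_capacity_w, curve):
    """AC power pulled from the outlet for a given DC load on a given PSU."""
    return dc_load_w / efficiency(dc_load_w / psu_capacity_w, curve)

# Assumed curve: efficiency falls off hard below ~10% load.
curve = {0.05: 0.70, 0.10: 0.80, 0.20: 0.87, 0.50: 0.90, 1.00: 0.87}

idle_w, gaming_w = 80, 600   # assumed desktop idle; 600W gaming load from the post
for cap in (850, 1000, 1600):
    print(f"{cap}W PSU: idle ~{wall_draw(idle_w, cap, curve):.0f}W at the wall, "
          f"gaming ~{wall_draw(gaming_w, cap, curve):.0f}W ({gaming_w / cap:.0%} load)")
```

With numbers like these, the oversized unit costs you at the wall during idle and light use while buying you nothing under load, which is the whole point of targeting your common workload rather than the worst case.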
 
>So, my general opinion is that you don't want to run a PSU at more than ~70% load.

The same applies to the AC outlet. For a US household with a standard 120V/15A (1800W) circuit, safe continuous load is 80%, or 1440W. A 1600W PC won't sustain a continuous load at that maximum, but it's reasonable to say it would require a dedicated circuit all to itself.
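Spelled out, using the figures above:

```python
# Continuous-load headroom on a standard US 120V/15A household circuit.
volts, amps = 120, 15
breaker_w = volts * amps            # 1800W nominal circuit capacity
continuous_w = 0.80 * breaker_w     # 80% rule for continuous loads -> 1440W
print(f"{breaker_w}W breaker, ~{continuous_w:.0f}W safe continuous load")
```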
 
I bought a used 3080 when I heard the news of the delay; however, I can't wait for the 5080 launch to see how much GDDR7 and the new architecture will impact performance. I predict a significant generational leap.
Regarding power consumption, I'm not really bothered as long as idle and media-playback consumption is low.
 
Seems like a good time to give up pc gaming.
Why?

There are a ton of options at more reasonable power consumption levels and prices that still provide a very good experience, even if they’re not the absolute top of the line, and I’m sure people here will be happy to provide recommendations. The 90-class cards are for enthusiasts who aren’t as concerned about price or efficiency as much as having “the best”, just like the Titan line was before them.
 
I am not understanding what is wrong with the 4000 series that we need a faster 5000 series. I think this is just too fast of a release strategy and causes those who spend their hard-earned money on a 4000 card to lose out on a lot of resale value.

You could make the argument this is the way it has always been, but that does not mean it has to stay like this. The 4000 still has the performance crown, and has the RT advantage over any competitor.

I just don't see the point of upgrading from a 4000. Maybe upgrading from a 3000 makes more sense, but then again the 4000 series will likely receive some clearance discounts, and the second-hand market will provide some good deals on used 4000 cards, so a 5000 card might not be the best choice either.
 
I am not understanding what is wrong with the 4000 series that we need a faster 5000 series. I think this is just too fast of a release strategy and causes those who spend their hard-earned money on a 4000 card to lose out on a lot of resale value.

Actually, if the rumors are true, and RTX-50 series GPUs get released in 2025, that's gonna be the slowest NVIDIA release ever, considering the fact that the company's normal release cycle is two years.

And, as far as the RTX-40 series is concerned, I can already think of half a dozen games that can bring a 4090 to its knees. How many more will be added to that list before the end of the year? Indiana Jones? Heart of Chornobyl? And how many more come early 2025? And if we've already reached the point where the most powerful GPU currently available is not enough for 4K Ultra path-traced gaming at 60 FPS, what exactly do you think that means for the rest of the RTX-40 series?

As a 4090 owner, it pains me to admit it, but I fear my card is becoming obsolete more quickly than I anticipated.
 
I am not understanding what is wrong with the 4000 series that we need a faster 5000 series. I think this is just too fast of a release strategy and causes those who spend their hard-earned money on a 4000 card to lose out on a lot of resale value.

You could make the argument this is the way it has always been, but that does not mean it has to stay like this. The 4000 still has the performance crown, and has the RT advantage over any competitor.

I just don't see the point of upgrading from a 4000. Maybe upgrading from a 3000 makes more sense, but then again the 4000 series will likely receive some clearance discounts, and the second-hand market will provide some good deals on used 4000 cards, so a 5000 card might not be the best choice either.
If you only game at 1080p or 1440p with RTX and frame-gen, sure.

Just take a look at how demanding Star Wars Outlaws is.
Yes, you can blame the game for poor optimization, and the devs using upscaling and frame-gen as crutches.
 
As a 4090 owner, it pains me to admit it, but I fear my card is becoming obsolete more quickly than I anticipated.

I think it may be more about how games are made today vs before.

When hardware was truly expensive (Think $5000+ for an IBM compatible with a 1MB video card, then adjust for inflation), they tried everything they could to make games run efficiently. Very limited memory being a huge driving factor.

Now, systems have plenty of memory and GPUs are more or less up to the task. Game developers don't have to worry about making an efficient game; they can just pump them out as quickly as possible to make as much money as possible.

There are a few standouts like id Software, still making games that look good and run on practically anything.

I should add that in some cases the visual impact of a game is the only real improvement they can make outside of good storytelling. I think a lot of AAA gaming innovation has become pretty stale. But that may just be more related to my taste in games.
 
I was hoping for more power-efficient cards. Welp, maybe the 6000 series.

AMD is kind of already there. They weren't shooting for power efficiency though, but cost reduction. But the same could apply. More GPU area, lower power/mm2 until you reach a balance.

But to keep making faster GPUs rather than just cheaper ones, they're probably going to have to start using HBM and massively increase cache amounts to make up for the latency penalties from tiling and physical distance.

But then gamers would complain about the forced power limits. You could run such a GPU at 1000W and get more performance, and that is where we seem to end up.

Pascal was good for power efficiency, but compare a GTX 1080 to an RTX 4070-class GPU: about the same die area and only slightly more power consumption. So that goal has already been achieved; it's just that they also make massively huge GPUs as well.
 
Looks like we'll have to deal with melting connectors again.
Consensus is that the melted connectors and cables were a result of poorly seated connectors. The weak connections had high resistance, resulting in excessive heat. In response, most ATX 3 PSUs are now shipping with a pack-in detailing how to handle/bend the 12VHPWR cable.
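For a sense of scale, the heating is just I²R at the contact. Here's a small sketch: the 600W/12V figures match the connector's rating, while the contact-resistance values are illustrative assumptions rather than measurements.

```python
# I^2 * R estimate of heat dissipated in a single poorly seated contact.
total_power_w = 600          # max rated draw over 12VHPWR
rail_v = 12.0
pins = 6                     # six 12V current-carrying pins
amps_per_pin = total_power_w / rail_v / pins   # ~8.3A per pin

for r_contact in (0.002, 0.010, 0.050):        # ohms: well seated vs. poor contact (assumed)
    heat_w = amps_per_pin**2 * r_contact
    print(f"{r_contact*1000:.0f} mOhm contact -> ~{heat_w:.2f}W of heat in that pin")
```

A few watts concentrated in one tiny pin, with no airflow inside the connector shroud, is plenty to soften plastic over time, which is why seating the cable fully matters so much.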
 
Actually, if the rumors are true, and RTX-50 series GPUs get released in 2025, that's gonna be the slowest NVIDIA release ever, considering the fact that the company's normal release cycle is two years.

And, as far as the RTX-40 series is concerned, I can already think of half a dozen games that can bring a 4090 to its knees. How many more will be added to that list before the end of the year? Indiana Jones? Heart of Chornobyl? And how many more come early 2025? And if we've already reached the point where the most powerful GPU currently available is not enough for 4K Ultra path-traced gaming at 60 FPS, what exactly do you think that means for the rest of the RTX-40 series?

As a 4090 owner, it pains me to admit it, but I fear my card is becoming obsolete more quickly than I anticipated.

But is it becoming obsolete because it lacks performance, or is it becoming obsolete because game titles are too demanding and require unrealistic hardware components?

A 4090 is a special card, only used by a small fraction of the gaming community. Most people use like a 4070 or a 4060.

The fact that games are bringing a 4090 to its knees tells me more about games than about GPU performance.
 
It's why I chose to build around a "T"-series Intel chip: it can push the 4060 Ti and keep power at around 200W at the wall... The heat output is very low; I barely feel it over the day.

Nvidia, AMD, and Intel are trying to COOK us with these last few generations.
 
Why would I ever want to buy a GPU with a 400W TDP? What hot garbage!
If they shipped the exact same card at 200w would you be happy? Why? You can do it yourself in 10 seconds. I really don't get what the issue is.

It's like saying why would I ever buy a monitor with the brightness at 100% by default. Uhm, just reduce the brightness to your desired level?
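For what it's worth, the "do it yourself" part really is quick on GeForce cards: the driver exposes a software power limit through nvidia-smi. A minimal sketch, assuming nvidia-smi is on PATH and you're running with admin/root rights (the driver clamps requests to the card's allowed min/max range):

```python
# Cap a GeForce card's board power from software by shelling out to nvidia-smi.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Ask the NVIDIA driver to enforce a lower board power limit (watts)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

set_power_limit(200)  # e.g. run a 400W-rated card at 200W, as in the post above
```

The same thing takes a couple of clicks with the power-limit slider in tools like MSI Afterburner.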
 
If they shipped the exact same card at 200w would you be happy? Why? You can do it yourself in 10 seconds. I really don't get what the issue is.

It's like saying why would I ever buy a monitor with the brightness at 100% by default. Uhm, just reduce the brightness to your desired level?
A 400W card underclocked to 200W is a massive waste of money and performance.
 
A 400W card underclocked to 200W is a massive waste of money and performance.
Not necessarily. It isn't linear.

Take a look at the mobile parts and their performance compared to the desktop parts. Compare GPU to GPU, not model to model, for a more direct comparison, and you will see that despite slower memory, a lower power limit, and generally lower clocks, they are not half the speed, even when they sometimes have far less than half the power limit.

I often run my 350W 3080 Ti at 280W; it still boosts pretty high.
 
I set my 400W 12GB 3080 to 75% power (I also have a bit of an undervolt on the frequency curve) and basically forgot I'd done so. The performance is definitely lower, but unless I'm playing something very graphically intensive I never really notice at all.
I have my EVGA FTW3 3080 10GB at 80% power and it gets about 95% of stock performance.
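Running the numbers on that last data point (which matches the nonlinearity argument above): roughly 95% of stock performance at an 80% power limit works out to a meaningful efficiency gain.

```python
# Perf-per-watt implied by the post above: ~95% performance at 80% power.
perf_fraction, power_fraction = 0.95, 0.80
gain = perf_fraction / power_fraction - 1
print(f"~{gain:.0%} better performance per watt for ~5% less performance")
```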