News: Improving Nvidia RTX 4090 Efficiency Through Power Limiting

Neilbob

Distinguished
Mar 31, 2014
That looks pretty much conclusive to me. Except for those people who insist they can actually tell the difference between 100 FPS and 102 FPS, there's no reason not to run this monstrosity in a more power-efficient manner.

I really wish the manufacturers would default to 75-80% (or less) power and let users decide via software/drivers if the extra less-than-5% performance is worth the bother, rather than chasing the performance-at-all-costs-because-benchmarks-are-all-important thing that seems to be going on lately.
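You can already do this yourself with a single `nvidia-smi -pl <watts>` command. For the scripted version, here's a rough sketch using the NVML Python bindings (`pip install nvidia-ml-py`); the 75% figure is just my example above, it needs admin/root, and I haven't tested it on a 4090:

```python
# Sketch: cap GPU 0 at 75% of its default power limit via NVML.
# Requires root/admin; the percentage is illustrative, not a recommendation.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementDefaultLimit,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)                         # first GPU
    default_mw = nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
    target_mw = int(default_mw * 0.75)                             # 75% of stock
    nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power limit: {default_mw / 1000:.0f} W -> {target_mw / 1000:.0f} W")
finally:
    nvmlShutdown()
```

As far as I know the setting doesn't survive a reboot, which is exactly why a saner driver default would be nicer than making users script it.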
 

bit_user

Titan
Ambassador
I really wish the manufacturers would default to 75-80% (or less) power and let users decide via software/drivers if the extra less-than-5% performance is worth the bother, rather than chasing the performance-at-all-costs-because-benchmarks-are-all-important thing that seems to be going on lately.
In the era of cut-throat benchmark competition, it's unrealistic to expect manufacturers to do this of their own accord.

I think a good first move would be some kind of energy labeling regulation that requires the energy usage of the default config to be accurately characterized. Similar to how we insist automobiles advertise their fuel-efficiency, according to a prescribed measurement methodology.
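The measurement half isn't even hard. A crude sketch of what a prescribed methodology could boil down to, assuming the NVML Python bindings, an arbitrary sample rate, and a benchmark workload you launch separately:

```python
# Sketch: sample GPU 0's board power over a fixed window and report
# average watts plus total energy. Duration and sample rate are arbitrary.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetPowerUsage)

def measure(duration_s=60.0, interval_s=0.1):
    """Return (average watts, total joules) over the sampling window."""
    nvmlInit()
    try:
        handle = nvmlDeviceGetHandleByIndex(0)
        watts = []
        deadline = time.monotonic() + duration_s
        while time.monotonic() < deadline:
            watts.append(nvmlDeviceGetPowerUsage(handle) / 1000.0)  # mW -> W
            time.sleep(interval_s)
        avg_w = sum(watts) / len(watts)
        return avg_w, avg_w * duration_s
    finally:
        nvmlShutdown()

if __name__ == "__main__":
    avg_w, joules = measure()
    print(f"Average draw: {avg_w:.1f} W, energy used: {joules / 3600:.3f} Wh")
```

The hard part, as with automobile fuel-efficiency ratings, is agreeing on the standard workload, not taking the reading.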
 

AgentBirdnest

Respectable
Jun 8, 2022
Wow! Those are really interesting results. I'm not gonna get a 4090, but it's still fascinating to know that I could lower the power by 30% or even 40% in the summer so I don't get cooked in my gaming room, and would only miss out on ~10% performance that I probably wouldn't notice.

Awesome work, Jarred. I've really wanted to see some tests exactly like this.
 
Wow! Those are really interesting results. I'm not gonna get a 4090, but it's still fascinating to know that I could lower the power by 30% or even 40% in the summer so I don't get cooked in my gaming room, and would only miss out on ~10% performance that I probably wouldn't notice.

Awesome work, Jarred. I've really wanted to see some tests exactly like this.
There are going to be cases where the losses are larger (e.g., Metro Exodus Enhanced), but yeah, Nvidia really stomped on the performance pedal (screw efficiency!). Which is interesting, as other Ada Lovelace parts (like the RTX 6000 48GB) aren't pushing nearly as hard.
 
Yes, let's keep reducing the power of the most expensive hardware we buy, accepting less performance, just so we can manage its heat and power output. Makes perfect buying sense!

Why are we allowing companies to pass that burden to us, again?

Regards.
 
Like · Reactions: PEnns

bit_user

Titan
Ambassador
Why are we allowing companies to pass that burden to us, again?
Benchmarks, benchmarks, benchmarks! Being the top performer translates into premium prices and greater sales volume. The temptation to turn up clock speeds just a little more has proven too great for companies to resist.

For efficiency to be prioritized, the market needs to value it. As I've been saying, a step towards that would be some standardization of metrics that can be used to compare products on that basis.
 
At 50% power your connector cable only reaches boiling-water temperature; at 80%, with the best power results, you're up to cold fusion... and at 100% on the average benchmark, your power connector achieves solar fusion and will burn every single thing on Earth.
 
Benchmarks, benchmarks, benchmarks! Being the top performer translates into premium prices and greater sales volume. The temptation to turn up clock speeds just a little more has proven too great for companies to resist.

For efficiency to be prioritized, the market needs to value it. As I've been saying, a step towards that would be some standardization of metrics that can be used to compare products on that basis.
To quote Gordon: the "bigger bar, better" mentality.

So we need to put more emphasis on the efficiency of tasks then? For every game running at 600 FPS, we also get the ms/watt** chart and draw conclusions based on that, yes?

Regards.

** ms/watt as in: frame time per watt.
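Something like this toy calculation (all numbers invented) is all such a chart would take; frames per joule is the same information the other way up:

```python
# Toy version of the proposed chart: frame time per watt, plus the
# equivalent frames-per-joule reading. All numbers are made up.
results = {
    # name: (average FPS, average board power in watts)
    "RTX 4090 stock (450 W cap)": (120.0, 430.0),
    "RTX 4090 @ 70% (315 W cap)": (111.0, 310.0),
}

for name, (fps, watts) in results.items():
    frame_time_ms = 1000.0 / fps
    ms_per_watt = frame_time_ms / watts   # the ** metric from above
    frames_per_joule = fps / watts        # same info; higher = better
    print(f"{name}: {frame_time_ms:.2f} ms/frame, "
          f"{ms_per_watt:.4f} ms/W, {frames_per_joule:.3f} frames/J")
```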
 

PEnns

Reputable
Apr 25, 2020
Sure, let's dump this time-bomb into consumers' laps and then let Nvidia-friendly websites figure out how to defuse it!! Because preventing the expensive pyrotechnics would be too much to ask.

Pathetic.
 

brandonjclark

Distinguished
Dec 15, 2008
Sure, let's dump this time-bomb into consumers' laps and then let Nvidia-friendly websites figure out how to defuse it!! Because preventing the expensive pyrotechnics would be too much to ask.

Pathetic.


It's not like people's homes are burning down. Seriously.

It's some fried connectors, and probably very FEW of them. Percentage-wise, it's likely statistically irrelevant.
 
Like · Reactions: bolweval

coolitic

Distinguished
May 10, 2012
You can just get a lower-tier card, like a 4070, and get higher performance-per-watt than a 4080 whose power limit is set to mimic said 4070. Plus you wouldn't spend as much on the card.

Frankly, especially after these last two generations of GPUs, I'm going to buy according to whatever performs the best at a TDP limit of my choosing; probably 200W.
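If reviewers published FPS at a few standard power caps, that shopping logic would be trivial to apply. A sketch with hypothetical numbers (the FPS figures are invented placeholders, not measurements):

```python
# Hypothetical "best card at my chosen TDP" picker.
# FPS values at a 200 W cap are invented placeholders.
MY_CAP_W = 200
fps_at_cap = {
    "RTX 4070 (stock, ~200 W)": 95.0,
    "RTX 4080 (capped to 200 W)": 92.0,
    "RX 7800 XT (capped to 200 W)": 88.0,
}

best = max(fps_at_cap, key=fps_at_cap.get)
print(f"At {MY_CAP_W} W, best performer: {best} ({fps_at_cap[best]:.0f} FPS)")
```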
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
This is absolutely bonkers. There's zero reason to ship this card with a 100% power target; like others said, it should've been a choice via drivers or by loading another BIOS.

Not that I'm getting one, but I would absolutely lower it to 80% like you said, just to be sane.
 

PEnns

Reputable
Apr 25, 2020
It's not like people's homes are burning down. Seriously.

It's some fried connectors, and probably very FEW of them. Percentage-wise, it's likely statistically irrelevant.

Yeah, sure, just a GPU worth $1600 to $2000 being damaged. It's just money being burnt. Pun fully intended.

I am sure you'd sing a different tune if it was an AMD GPU!!