RTX 3090 Ti Is a Surprisingly Efficient Gaming Beast When Limited to 300W

Underclocking one card and calling it the efficiency king is...misleading.

Still, as mentioned, it's not hard to believe that you can gain a lot of efficiency by not running a chip at its maximum frequency (+55% power consumption for +11% frequency). Then again, the 3090 Ti was designed to be an FPS chart-topper. Throw all common sense out the window.
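A quick back-of-envelope sketch of what that trade costs in perf-per-watt (normalized from the numbers quoted above; assumes performance scales at best linearly with frequency):

```python
# +55% power for +11% frequency, normalized to the power-limited config.
stock_perf, stock_power = 1.00, 1.00      # power-limited baseline
boost_perf, boost_power = 1.11, 1.55      # chart-topping configuration

eff_stock = stock_perf / stock_power
eff_boost = boost_perf / boost_power

print(f"limited: {eff_stock:.2f} perf/W (normalized)")
print(f"maxed:   {eff_boost:.2f} perf/W (normalized)")
print(f"perf-per-watt lost at max: {(1 - eff_boost / eff_stock) * 100:.0f}%")  # ~28%
```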
 
I've seen this happening with my RTX 2070 Super. In Final Fantasy XIV, if I let the card go full ham, it'll happily go up to around 220W. But if I power limit it to about 80%, I get zero performance drops.

So I took this further and undervolted my card. At what I have set currently, it's limited to about 1935MHz but sits at 0.950V and consumes about 75%-80% as much power. And I added another profile on MSI Afterburner to see how low I can get it to still run the stock boost speed of 1800MHz. It goes down as much as 65%. Normally if I let the card do its thing, it can get up to about 2050MHz. So I could get a theoretical performance gain of ~14%, but I'd have to consume at least 70 more watts getting there.
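Those numbers are internally consistent. A quick check (the ~220W ceiling is quoted above; treating 65% of that as the lowest stable profile's draw is my reading of the post):

```python
full_clock, base_clock = 2050, 1800   # MHz: unrestricted boost vs. stock boost
full_power = 220                      # W at "full ham" (quoted above)
low_profile_power = 220 * 0.65        # W: lowest profile still holding 1800 MHz

gain = full_clock / base_clock - 1
extra_watts = full_power - low_profile_power

print(f"theoretical performance gain: {gain * 100:.0f}%")  # ~14%
print(f"extra power to get there: {extra_watts:.0f} W")    # ~77 W
```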

Similarly, I noticed during some Handbrake runs that I'd left my CPU's turbo disabled by accident. I only lost about 15% performance, but the CPU was also only consuming about 60% of the power.

I'm starting to not see the point of full on turbo boosting unless I need to polish my e-peen.
 
We've been conditioned to look primarily at performance charts. The consumer wants to be high on the performance chart. Power consumption is a back-burner topic as long as it's manageable.

AMD/Intel/Nvidia can get (essentially) free performance out of boosting frequency at the cost of power. As long as their competition isn't absolutely blowing them out of the water on power draw and/or power draw is manageable for heat dissipation and noise, they know that there will only be a handful of press reviews exploring/mentioning power efficiency. And in the end, the consumer defaults to the performance chart. So, like Intel, if you're topping the performance charts, people don't seem to care much that it's costing nearly 2x the power draw compared to AMD.

There's just so little literature [like this] out there to help the consumer realize how little performance they'd give up for a huge decrease in power consumption/heat/noise.



FWIW - I run my 3060Ti at 165W TDP.
 
I think the die sizes of the Apple M1 Max/Ultra/whatever are a telltale sign of Apple's huge profit margins giving them design freedoms that "typical" chipmakers don't have. Much like the 300W 3090 Ti article shows, if you supply a larger amount of hardware to achieve a desired performance at less than the maximum/peak frequency, you get major efficiency benefits.
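The point generalizes with a toy model. Assuming voltage has to rise roughly linearly with frequency and dynamic power scales as units × V² × f (the coefficients here are illustrative, not Apple's actual figures):

```python
def voltage(freq):
    """Assumed normalized voltage needed to sustain a given frequency."""
    return 0.6 + 0.4 * freq

def power(units, freq):
    """Dynamic power ~ active hardware units * V^2 * f (normalized)."""
    return units * voltage(freq) ** 2 * freq

# Chip A: baseline width at full clock. Chip B: 1.5x the hardware at
# 2/3 the clock -- same throughput (units * freq) either way.
p_narrow_fast = power(1.0, 1.0)
p_wide_slow = power(1.5, 2 / 3)

print(f"same performance at {p_wide_slow / p_narrow_fast * 100:.0f}% of the power")  # ~75%
```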
 

Endymio

"At typical power costs, it would take around 3,300 days of 24/7 use to make up for the $800 [power cost] difference ..."

In an air-conditioned climate, you not only pay the direct power cost, but nearly as much to pump the excess heat out of your home as well.
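The quoted payback figure is easy to sanity-check. A sketch with assumed inputs (~100 W of extra draw and $0.10/kWh; neither number is from the article), plus the air-conditioning penalty modeled via a typical coefficient of performance:

```python
price_premium = 800.0     # $: up-front cost difference (quoted above)
extra_draw_kw = 0.100     # kW of additional draw (assumed)
rate = 0.10               # $/kWh electricity price (assumed)

days = price_premium / (extra_draw_kw * 24 * rate)
print(f"payback without A/C: {days:.0f} days")  # ~3333, in line with the quoted 3,300

# An air conditioner with COP ~3 spends roughly 1/COP extra watts
# pumping each watt of waste heat back outside.
cop = 3.0
days_ac = price_premium / (extra_draw_kw * (1 + 1 / cop) * 24 * rate)
print(f"payback with A/C: {days_ac:.0f} days")  # ~2500
```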
 

thisisaname

"At typical power costs, it would take around 3,300 days of 24/7 use to make up for the $800 [power cost] difference ..."

In an air-conditioned climate, you not only pay the direct power cost, but nearly as much to pump the excess heat out of your home as well.

If you live somewhere cold, it does make a good space heater :)
 
I don’t understand what the big deal is, I run my 6900xt at the stock 289W but with a -0.125V offset and it’s faster than stock by about 8%, never got an article boasting that….
 

Phaaze88

I don’t understand what the big deal is, I run my 6900xt at the stock 289W but with a -0.125V offset and it’s faster than stock by about 8%, never got an article boasting that….
Well, if you look at some of the vbios files over yonder at TPU... excluding that transient power draw stuff, a 6900XT has power limits around that of a 3070-3070Ti.

They already had 'em beat with ray tracing and DLSS - I guess NVENC and Shadowplay too - but geez...
 

King_V

I found myself some time back saying "Is Nvidia pulling a Vega move?" with the idea that they're saying "damn the torpedoes" when it comes to the grossly diminishing returns for pushing the power limits higher.

This seems to confirm it.
 

aalkjsdflkj

Can anyone explain the importance of Igor modifying the VF curve? Is this necessary when reducing the power limit? Could we expect the results to be close if he hadn't modified the VF curve, or is that required?
 
Can anyone explain the importance of Igor modifying the VF curve? Is this necessary when reducing the power limit? Could we expect the results to be close if he hadn't modified the VF curve, or is that required?
Undervolting. Apply less voltage (compared to the stock curve) at a given frequency. Maximizes efficiency further since it allows higher frequency at the same voltage/power draw.
Details are pretty sparse in the article.
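For intuition, a toy CMOS dynamic-power model (P ≈ k · V² · f) shows why shifting the curve down pays off; the voltage/clock values here are illustrative, not Igor's actual curve points:

```python
def dynamic_power(voltage, freq_mhz, k=1.0):
    """Switching power in CMOS logic scales roughly with V^2 * f."""
    return k * voltage ** 2 * freq_mhz

stock = dynamic_power(1.050, 1900)        # stock V/F point (illustrative)
undervolted = dynamic_power(0.950, 1900)  # same clock, 100 mV less

print(f"same clock at lower voltage: {undervolted / stock * 100:.0f}% of stock power")  # ~82%
```

Lowering the power limit alone just caps the top of the stock curve; reshaping the curve lets the card hold higher clocks inside the same power budget.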
 

watzupken

The result is hardly surprising, actually. The problem with the RTX 3090 Ti is that it has gone way past the “sweet spot” clockspeed that Samsung's 8nm node can provide, thus requiring a lot more power to get past that 2+ GHz mark. Those who have managed to undervolt their RTX 3090/3080 know that it is actually possible for the cards to lose very little performance while dropping power fairly significantly.
 
The result is hardly surprising, actually. The problem with the RTX 3090 Ti is that it has gone way past the “sweet spot” clockspeed that Samsung's 8nm node can provide, thus requiring a lot more power to get past that 2+ GHz mark. Those who have managed to undervolt their RTX 3090/3080 know that it is actually possible for the cards to lose very little performance while dropping power fairly significantly.
Considering that, at least in my experience, every part is grossly inefficient if you let it run with stock settings and go full bore, I think this is just AMD, Intel, and NVIDIA all pining for those sweet benchmark scores that consumers care about.

Heck, I should add most mainstream electronics manufacturers as well, since Samsung was recently caught throttling apps on the Galaxy S22, except for benchmarking apps.
 
Considering that, at least in my experience, every part is grossly inefficient if you let it run with stock settings and go full bore, I think this is just AMD, Intel, and NVIDIA all pining for those sweet benchmark scores that consumers care about.
It's inevitable. More resources on a chip means a larger, more expensive chip. If your competitor can extract extra performance simply by turning frequency up to 11, at little or no cost in terms of BoM (better power delivery, a better cooler), you're forced to do the same to counter. Frequency is pure profit, in a sense. As a result, we've reached the era where everyone's CPU/GPU boosts so aggressively out of the box that very little manual OC potential is left on the table.
 

King_V

(attached image: volume-11.png)