News: Improving Nvidia RTX 4090 Efficiency Through Power Limiting

@JarredWaltonGPU, I just noticed something missing: there's no actual plot of efficiency, in terms of FPS/W.

Based on eyeballing the plot (second image, first figure), here's what I got:

Power Ratio | Board Power (W) | Mean FPS | 99% FPS | Mean FPS/W | 99% FPS/W
1.2         | 425             | 64.8     | 77.6    | 0.15       | 0.18
1.1         | 410             | 64.7     | 77.3    | 0.16       | 0.19
1.0         | 390             | 64.9     | 76.9    | 0.17       | 0.20
0.9         | 370             | 64.3     | 76.3    | 0.17       | 0.21
0.8         | 340             | 63.2     | 75.1    | 0.19       | 0.22
0.7         | 310             | 61.9     | 73.1    | 0.20       | 0.24
0.6         | 260             | 58.4     | 68.9    | 0.22       | 0.27
0.5         | 210             | 50.0     | 59.3    | 0.24       | 0.28

A plot of the last two columns would really show how dramatically efficiency increases as power is decreased. That said, it's interesting how the 99% efficiency column suggests 50% power might be nearing a plateau or even a peak.
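If anyone wants to plot it, here's a quick matplotlib sketch using the eyeballed numbers from the table above (so treat it as approximate):

```python
import matplotlib.pyplot as plt

# Eyeballed values from the table above (approximate)
power_ratio    = [1.2, 1.1, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5]
mean_fps_per_w = [0.15, 0.16, 0.17, 0.17, 0.19, 0.20, 0.22, 0.24]
p99_fps_per_w  = [0.18, 0.19, 0.20, 0.21, 0.22, 0.24, 0.27, 0.28]

plt.plot(power_ratio, mean_fps_per_w, marker="o", label="Mean FPS/W")
plt.plot(power_ratio, p99_fps_per_w, marker="s", label="99% FPS/W")
plt.xlabel("Power limit (fraction of stock)")
plt.ylabel("Efficiency (FPS/W)")
plt.legend()
plt.grid(True)
plt.show()
```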
 
In the era of cut-throat benchmark competition, it's unrealistic to expect manufacturers to do this of their own accord.

I think a good first move would be some kind of energy-labeling regulation that requires the energy usage of the default config to be accurately characterized, similar to how we require automobiles to advertise their fuel efficiency according to a prescribed measurement methodology.

What I'd like to see is a BIOS switch on some cards, say the Noctua editions, that flips on a power-limited mode at something like 60-70% power instead of an OC mode. I bet a tweaked BIOS could get even better efficiency. The 4090 may not be the best candidate for this, but the 4080 16GB or AMD's new RDNA3 cards might be. Anyhow, I would love to see options like that. These larger cards will outperform the lower-end cards even when power limited, since so much GPU work is parallel. I personally would "over pay" and power limit a card just to get more perf at, say, 200-250W.
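For what it's worth, you can already approximate that in software today: drag the Power Limit slider in MSI Afterburner, or run nvidia-smi -pl <watts> from an admin prompt. If you'd rather script it, here's a minimal sketch using Nvidia's NVML Python bindings (the 70% target is just an example, and it needs admin rights):

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Limits are reported in milliwatts
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Default limit {default_mw / 1000:.0f} W, allowed range "
      f"{min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")

# Example: set a ~70% power limit, clamped to what the card allows
target_mw = max(int(default_mw * 0.70), min_mw)
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```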
 
Could someone give me a hint on how I can limit the power of my Nvidia graphics card, please? Is there a tool for this?
 
Why are people and reviewers spreading lies and misinformation about the RTX 4090's power usage?

Total System Power @1440p ULTRA on Battlefield 2042 is less than 400w. With DLSS and RT on, it's actually between 300-400w, uncapped.
World of Warcraft Ultra + LiveStreaming using NVENC - 250-300w.

BF2042 on 3 diff 4090s in 3 diff systems.

Please tell me where you get these insane numbers from! Not even past 3050mhz can I get beyond 500w total system power!
 
Why are people and reviewers spreading lies and misinformation about the RTX 4090's power usage?

Total System Power @1440p ULTRA on Battlefield 2042 is less than 400w. With DLSS and RT on, it's actually between 300-400w, uncapped.
World of Warcraft Ultra + LiveStreaming using NVENC - 250-300w.

BF2042 on 3 diff 4090s in 3 diff systems.

Please tell me where you get these insane numbers from! Not even past 3050mhz can I get beyond 500w total system power!
Nah, it's not lies and misinformation. It is CPU limitations.
The GPU can only go as far as the CPU does. In the games you mention, the CPU bottlenecks the performance before a 4090 (or even 3090, as Jarred reported in his benchmarks for BF2042) can fully stretch its legs. WoW isn't too graphically demanding, so it is hitting the CPU's limit long before the GPU can max itself out, and NVENC doesn't use much power at all.
Games like Cyberpunk 2077 or Metro Exodus Enhanced, which have the most demanding graphics and use multiple types of ray tracing, will use 400+ watts when running at 4K resolution, because the 4090 is unconstrained. The GPU is running on all cylinders, putting out frames at a low enough speed that the CPU can keep up in these games.
 
Why are people and reviewers spreading lies and misinformation about the RTX 4090's power usage?

Total System Power @1440p ULTRA on Battlefield 2042 is less than 400w.
That wasn't one of the games he tested. Nor was WoW Ultra. In another article examining the RTX 4090's FPS/W, I found some games that underutilized the GPU also used significantly less power. Maybe you've stumbled upon two such titles? What was the average % utilization for those games?

In this post, I computed the average power usage over games with average utilization over 95%:

(Summary: average power was 422.2 W and 428.9 W for 1440p and 4K, respectively)


The real kicker is when I computed these same averages over the games with less than 95% utilization:
(Summary: average power was 332.5 W and 358.6 W for 1440p and 4K, respectively)


This shows the average power usage can definitely vary with the game, depending particularly on its average % utilization. If you can collect that data, it would be useful to compare with my analysis of Jarred's prior data.
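I did the split by hand from Jarred's table, but for anyone who wants to repeat it on their own numbers, it boils down to something like this (the file and column names here are mine, not Jarred's):

```python
import pandas as pd

# Hypothetical layout: one row per game/resolution with average board power
# and average GPU utilization over the benchmark run.
df = pd.read_csv("rtx4090_per_game.csv")  # columns: game, resolution, avg_power_w, avg_util_pct

for res, group in df.groupby("resolution"):
    busy = group[group["avg_util_pct"] >= 95]["avg_power_w"].mean()
    light = group[group["avg_util_pct"] < 95]["avg_power_w"].mean()
    print(f"{res}: >=95% util averages {busy:.1f} W, <95% util averages {light:.1f} W")
```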

The raw data I used was provided by Jarred, here:


Note how the lowest among those is Far Cry 1440p Ultra, at only 232.6 W!

Please tell me where you get these insane numbers from!
Aside from some games using less power than others, did you look at the driver versions?
 
It is CPU limitations.
The GPU can only go as far as the CPU does. In the games you mention, the CPU bottlenecks the performance before a 4090 (or even 3090, as Jarred reported in his benchmarks for BF2042) can fully stretch its legs. WoW isn't too graphically demanding, so it is hitting the CPU's limit long before the GPU can max itself out,
Thanks! That lines up with my conjecture. We'd know for sure if we could get average % utilization figures on the RTX 4090 for those games.

Games like Cyberpunk 2077 or Metro Exodus Enhanced, which have the most demanding graphics and use multiple types of ray tracing, will use 400+ watts when running at 4K resolution, because the 4090 is unconstrained. The GPU is running on all cylinders, putting out frames at a low enough speed that the CPU can keep up in these games.
Yup. If you click the Spoiler link in my post, you can see those are two of the highest power users.
 
So getting a 4090 for anything lower than 4K is not a balanced config...
Not sure where you got that. According to the data I quoted from Jarred, 1440p Ultra averaged 126.9 FPS on the 13-game geomean, and 104.3 FPS on the 6-game DXR geomean. That's average, but we didn't see the 99th percentile numbers, which are arguably more important.

At 4k Ultra, on the other hand, the DXR geomean average FPS drops to just 55.5. So, if those are the quality settings you want to use, I'd recommend sticking with 1440p.
 
Because averages represent everyone - just forget about those who only play certain titles/genres, the scenario in which averages could be quite a ways off.
Yes, and that's a general problem I have with averaging framerates across disparate games. However, it does show an overall picture where you cannot simply assume that even a GPU as powerful as the RTX 4090 can handle 4K at max quality. And that caveat is quite valid.

BTW, in the article I was quoting my posts from, Jarred was averaging power across disparate games. That's an even more problematic exercise, because many games in his benchmark ran the GPU at >= 95% utilization (yes, even at 1440p) and clustered rather tightly in the low-to-mid 400 W range. However, a substantial number ran at lower utilization, and those dragged down the mean. The result is that someone might pick the 4090 based on a false expectation of what the power usage would be like in many newer, more-demanding games.

I think what we're talking around is that a single number is simply inadequate for characterizing an entire population. At minimum, you'd need mean and variance, but that still wouldn't tell you the shape of the distribution. So, we'd really want a histogram.
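Something like this, in other words (hypothetical file and column names, purely to illustrate the idea):

```python
import pandas as pd
import matplotlib.pyplot as plt

# One row per game; a histogram shows the shape of the power distribution,
# which a single mean (or even mean plus variance) hides.
df = pd.read_csv("rtx4090_per_game.csv")  # hypothetical column: avg_power_w

plt.hist(df["avg_power_w"], bins=15)
plt.xlabel("Average board power (W)")
plt.ylabel("Number of games")
plt.show()
```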
 
Nah, I'm sure Kombustor and 3DMark are so CPU limited that they don't fully stretch the 99%-utilised legs of my 4090.

Very correct answers indeed.

Back to my original question...
I can list maybe 15+ games I've tested, including 2033 Redux, Fallout 76, Days Gone, Destiny 2, Doom, Overwatch, Apex (uncapped), the Witcher 3 RTX update... ALL at QHD 1440p with a 5800X, hitting 350W max, generally speaking.

My system alone has a LOT of drives and things attached; it ain't no green-friendly PC.

An MSI Kombustor stress test puts my total power out at 480-520W (so like 550W in).

I'm on the desktop, and yes, I have some things updating in the background, but my current usage is 200W out, generally 150-200W idle. So we're talking 350W usage at the utter most. I've seen no difference in usage going from a 6700XT to a 4090, and I see a hella lot less than my 3090 did, bouncing over 900W+ system out; Jesus, that was a monster.

My brother has a Gaming X 4090 and says the exact same thing: he's gobsmacked, questioned what all the fuss was about and whether it's an issue with his system, but his benchmark numbers are pretty much spot on.

I have to literally put my GPU at 3050+ MHz to start getting near 600W, which is still way less than my 3090.

I can't believe how much of a deal everyone's making, yet how much they suck up to the XTX when that guzzles way more power.

Honestly the whole "they run hot" or "they drink power" thing is way overdone, idiotic even.

What you're doing is probs testing way beyond normal gaming use and purposely pushing these cards well beyond normal limits.

I initially reinstalled Windows, changed power cables, googled, etc., trying to work out why my GPU was using so little power, cos the media (inc. Tom's) has totally blown the power usage out of proportion.

And let me quote:-
"The 90% and 80% power limit both deliver similar performance, at least within the margin of error — 3% slower than stock, with power use of 368W and 346W, respectively. The 70% limit drops power use to just 309W, an 18% reduction versus stock, while performance is down 9%. The best setting for Fortnite, based on our testing, seems to be 60%: you get 91% of the base performance while using 29% less power."

OK, my System struggles to hit 370w TOTAL SYSTEM POWER so where'd these "lower power usage" numbers come from?

I'm guessing your numbers are "total system power draw", in which case the Intel CPU would be why you're so thirsty, cos especially the top-end CPUs are animals, but all Intel CPUs right now have dog efficiency; what you should have tested it with was a 5900X.

And this article's worth is questionable, if you're so worried about power usage you wouldn't be buying a 4090, and for some people who "claim" to own one and are going to reduce power usage to save on power, that's the most hilarious thing I've heard since NVIDIA's new pricing scheme.

Who on earth buys the world's most expensive gaming GPU to sit and then restrict its performance, making their price per perf even worse than it normally is? I'm amused. Really.
 
The GPU is running on all cylinders, putting out frames at a low enough speed that the CPU can keep up in these games.

Quite frankly that's the most irritating comment of late. The GPU is always "running on all cylinders"; what RTX titles do is also use RT cores that are not used in general gaming... they have been proven to use a lot of power. However, my case rests on what I said about 3DMark, which includes Port Royal.

Also, as an FYI, WoW does support RT and it does significantly drop FPS, something that wouldn't occur if it was simply the GPU not being fully used. Whilst we can sit and debate the pathetic indie-dev performance and coding of WoW, it doesn't particularly translate into much difference with other titles. RT on or off, I am by no means ever seeing total system power beyond 500W, of which the desktop is using 150-200W on average depending on what's happening in the background, and that's with the GPU barely being used at all, 1% say.

As I've stated before the "lab" tests are utterly off.

This is like the entire idiotic comment saying the Ryzen 6-cores are better for gaming/gamers than the 8-cores, that the 8-cores are such terrible value...

....BECAUSE these things are NEVER, EVER tested in NORMAL general gaming scenarios. Simply put, there are no other background apps running, such as AV / IS / anti-malware, Discord, overlays, streaming, stuff like Warcraft Logs or, for LoL, the team-viewing stuff, rbg stuff, general crap like Steam running and EA and everything else. (And on another level, this is what makes the latest Intel CPUs look way more appealing than anything AMD.)

Let me be perfectly clear right now: neither I nor any other person I know with a 4090 can replicate the kind of power draws reviews like this are showing... it's almost like AMD paid you to make an issue out of it (until their 7900 XTX came along and made a 3090 Ti look efficient).
 
And let me quote:-
"The 90% and 80% power limit both deliver similar performance, at least within the margin of error — 3% slower than stock, with power use of 368W and 346W, respectively. The 70% limit drops power use to just 309W, an 18% reduction versus stock, while performance is down 9%. The best setting for Fortnite, based on our testing, seems to be 60%: you get 91% of the base performance while using 29% less power."

OK, my System struggles to hit 370w TOTAL SYSTEM POWER so where'd these "lower power usage" numbers come from?
So, you used the same games with the exact same settings? Also, exactly which model of card do you have, because this was a Founders Edition.

I'm guessing your numbers are "total system power draw"
Nope. The article clearly says:
"We capture all performance data using Nvidia's FrameView utility, which also collects GPU clock speeds, temperatures, and power consumption data."​

It goes on to say that those figures have been validated by their Powernetics meter to be accurate to within 10 W.

If you're serious about trying to reproduce their data, you should also use Nvidia's FrameView.

And this article's worth is questionable,
Disagree.

if you're so worried about power usage you wouldn't be buying a 4090,
Disagree. Some have rather expensive power, or would like to reduce the amount of heat it puts out in the summer, because then you're paying not only to generate the heat but also to remove it (i.e. via air conditioning).
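Quick back-of-envelope, assuming a typical air conditioner moves roughly 3 W of heat per watt of electricity it consumes: shaving 150 W off the GPU saves that 150 W directly, plus roughly another 50 W of cooling load, so call it about 200 W at the wall during AC season.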

Not only that, but the issues the RTX 4090 had been having with halt-and-catch-fire probably had some people feeling uneasy at the time this was published, and looking for ways to mitigate their risk.

Who on earth buys the world's most expensive gaming GPU to sit and then restrict its performance,
Because you can dial it back a fair bit, and still have the fastest GPU.
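Run the arithmetic on the Fortnite numbers quoted above: 91% of the performance at 71% of the power works out to 0.91 / 0.71 ≈ 1.28, i.e. roughly 28% more FPS per watt, and it's still faster than anything else you could buy.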

Most people owning a RTX 4090 fall pretty neatly into one of three categories:
  1. Those who have more money than ways to spend it, and just buy one basically because they can.
  2. Those who stretch their resources to buy one, because they really want it.
  3. Those who use it to generate income. These include 3D artists, AI researchers, and (until recently) crypto miners.

Only category #1 definitely doesn't care about power costs. They still might care about power, if they're even a little bit eco-minded. Category #2 very much cares about power, unless they're not the one paying the electric bill. #3 usually is looking to minimize costs, and that naturally includes electricity.

Speaking of costs, let's consider people in Texas or other hot places. Texas is usually one of the cheaper electricity markets, but they have variable pricing and it's not uncommon for people to spend several hundred $ per month just to air condition an apartment. Air conditioning, in those buildings, is about as essential as heating is, during the winters up north.

So, I can definitely understand people who'd rather turn down their fan heater of a GPU, at the expense of a few FPS, during those months. It makes sense to know what you're giving up, which brings us to the value of this article.
 
Quite frankly that's the most irritating comment of late.
Then run the FrameView utility and post your data to show it's running at high utilization. I don't care what you believe about your GPU, I just want the data. This whole argument is about data, so if you don't provide exactly comparable data your assertions are meaningless to me.
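If it helps, here's roughly how I'd summarize a capture. MsBetweenPresents is the standard frametime column; the power and utilization column names below are placeholders, so check the header of your own FrameView CSV and adjust:

```python
import pandas as pd

log = pd.read_csv("FrameView_capture.csv")

# Average FPS = total frames / total time
avg_fps = 1000.0 / log["MsBetweenPresents"].mean()
print(f"Average FPS: {avg_fps:.1f}")

# Placeholder column names; use whatever your FrameView version actually logs
for col in ["GPU Power (W)", "GPU Utilization (%)"]:
    if col in log.columns:
        print(f"{col}: average {log[col].mean():.1f}")
```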

Let me be perfectly clear right now: neither I nor any other person I know with a 4090 can replicate the kind of power draws reviews like this are showing...
That's funny, because it doesn't look to me like you really even tried. For instance you clearly didn't even bother to read the section titled "RTX 4090 Power Limiting Test Setup".

Provide test data with:
  • Same game(s)
  • Same settings (including, I'm going to guess, disabling V-sync. @JarredWaltonGPU ?)
  • Same measurement methodology
  • Full specs of relevant hardware & software
...or else I think we're done here.
 
Then run the FrameView utility and post your data to show it's running at high utilization. I don't care what you believe about your GPU, I just want the data. This whole argument is about data, so if you don't provide exactly comparable data your assertions are meaningless to me.

That's funny, because it doesn't look to me like you really even tried. For instance you clearly didn't even bother to read the section titled "RTX 4090 Power Limiting Test Setup".

Provide test data with:
  • Same game(s)
  • Same settings (including, I'm going to guess, disabling V-sync. @JarredWaltonGPU ?)
  • Same measurement methodology
  • Full specs of relevant hardware & software
...or else I think we're done here.
Yes. And I should note that I'm seeing basically the same sort of power figures using a PCAT v2 with FrameView to capture actual in-line power consumption. I've shown those results in various reviews now, on the power testing section. For example, the AMD RX 7900 Series Launch has six tables, showing 7900 XTX, 7900 XT, 6950 XT, 4090, 4080, and 3090 Ti. The RTX 4070 Ti review has 11 table images showing the above six GPUs, along with the RTX 4070 Ti, 3070 Ti, 3080, 3080 Ti, and RX 6800 XT. Those are not total system power figures, they're real (averages over the benchmark run) data of power consumed by the graphics card, including both the PCIe slot and any 8-pin or 16-pin PEG connectors. All of my retesting of current and previous generation GPUs on the 13900K now collects this data, but I haven't finished with all the testing — RTX 30-series is done, but I need to test six or so AMD RX 6000-series cards still.

It's also not even remotely useful for him to say that he can't reproduce these power figures without actually listing full system specs, games tested, game settings, and test sequences. I could provide CSV files for every one of the results, showing not just averages but the full benchmark sequence with power spikes and troughs. To question the data without being willing to list basic specs and test setups is just dumb. My testing for example shows the exact system specs, which are basically a maxed out PC so that it will push the GPU as hard as possible. If you were to do the same tests with a Ryzen 9 5900X, I can guarantee power draw from the GPU would drop substantially, just because for gaming purposes, the 5900X is quite a bit slower than a 13900K.
 
It's also not even remotely useful for him to say that he can't reproduce these power figures without actually listing full system specs, games tested, game settings, and test sequences.
Assuming someone would want to try and reproduce your data, where should they look to find the main details of your testing methodology? Things like whether to disable VSync (and if not, what monitor did you use?) and do the games have a specific benchmark mode and how long do you let it run? I assume our new friend has lost interest, but in case he or someone else wants to try.

If you were to do the same tests with a Ryzen 9 5900X, I can guarantee power draw from the GPU would drop substantially, just because for gaming purposes, the 5900X is quite a bit slower than a 13900K.
I was hoping that we could simply see the effect in the % utilization figures, without having to get onto another whole tangent about CPUs.
; )

Do you think the fact that you're using a Founders Edition is at all significant (i.e. if he's not)? Or is its power consumption comparable to the other 4090s?
 
Assuming someone would want to try and reproduce your data, where should they look to find the main details of your testing methodology? Things like whether to disable VSync (and if not, what monitor did you use?) and do the games have a specific benchmark mode and how long do you let it run? I assume our new friend has lost interest, but in case he or someone else wants to try.

I was hoping that we could simply see the effect in the % utilization figures, without having to get onto another whole tangent about CPUs.
; )

Do you think the fact that you're using a Founders Edition is at all significant (i.e. if he's not)? Or is its power consumption comparable to the other 4090s?
I tested six different RTX 4090 cards: the Founders Edition, PNY, and Gigabyte variants all had similar clocks, while the Asus, Colorful, and MSI models had slightly higher power limits. Vsync can generally be set to application preference in the drivers, but always make sure it's off in the games; Minecraft is an exception and needs to be forced off in the drivers. I am currently testing with either an Acer Predator X27 (4K 144Hz G-Sync) or Samsung Neo G8 32 (4K 240Hz G-Sync Compatible, sometimes running at 120Hz because certain non-Nvidia cards don't like the 240Hz setting). I have confirmed in previous testing that the settings used don't change with the monitor, and also checked with a 4K 60Hz display at one point.

This article on how to test your graphics card contains instructions for a bunch of games. It's slightly outdated as it doesn't have newer games like Far Cry 6 listed, and in fact doesn't have very many games from my current test suite. But if anyone really wants to replicate our test sequences, I could share instructions somewhere (which I'd have to create so I'm not really planning on it right now). All the settings used are listed in the charts, the only games that use manual benchmarking (meaning, not a built-in benchmark) are Control, A Plague Tale: Requiem, and Spider-Man: Miles Morales. Flight Simulator is also a "manual" test, but it's just the landing activity in Iceland (under Epic challenges). I turn to the left a bit, then press "5" on the numpad to level off, then start logging frametimes and cut the engine to 20%... and the plane will "fly itself" for the next 40 seconds or so before crashing into the water.

Benchmark times (again, all using FrameView) are anywhere from 30 seconds (Control, MSFS, Plague Tale, Spider-Man) to as much as 250 seconds (RDR2), with most being in the 60–90 seconds range.
 
I am currently testing with either an Acer Predator X27 (4K 144Hz G-Sync) or Samsung Neo G8 32 (4K 240Hz G-Sync Compatible, sometimes running at 120Hz because certain non-Nvidia cards don't like the 240Hz setting). I have confirmed in previous testing that the settings used don't change with the monitor, and also checked with a 4K 60Hz display at one point.
The question of monitors was only in case you weren't disabling V-Sync. With it disabled, I wouldn't expect the monitor to affect anything (other than by virtue of its resolution and perhaps sometimes whether HDR is enabled).

Speaking of which, have you ever looked at the performance impact of enabling HDR?
 
The question of monitors was only in case you weren't disabling V-Sync. With it disabled, I wouldn't expect the monitor to affect anything (other than by virtue of its resolution and perhaps sometimes whether HDR is enabled).

Speaking of which, have you ever looked at the performance impact of enabling HDR?
Yup. And I should note HDR is turned off, just to avoid that particular potential issue. :)
 