80% looks like a sweet spot
> 80% looks like a sweet spot
I agree. From 80% to 70%, the drop in speed is not big and power usage goes down nicely!
> If you care more about power than up-front costs, you can do better by getting a larger GPU and just restricting it to your desired power budget.
The opposite is true: a bigger chip draws more power as a baseline.
> The opposite is true: a bigger chip draws more power as a baseline.
I meant within limits, but I guarantee you can get more performance out of a 4090 at 200 W than a 4080 at 200 W.
Power Ratio | Board Power (W) | Mean FPS | 99% FPS | Mean FPS/W | 99% FPS/W |
---|---|---|---|---|---|
1.2 | 425 | 64.8 | 77.6 | 0.15 | 0.18 |
1.1 | 410 | 64.7 | 77.3 | 0.16 | 0.19 |
1.0 | 390 | 64.9 | 76.9 | 0.17 | 0.20 |
0.9 | 370 | 64.3 | 76.3 | 0.17 | 0.21 |
0.8 | 340 | 63.2 | 75.1 | 0.19 | 0.22 |
0.7 | 310 | 61.9 | 73.1 | 0.20 | 0.24 |
0.6 | 260 | 58.4 | 68.9 | 0.22 | 0.27 |
0.5 | 210 | 50.0 | 59.3 | 0.24 | 0.28 |
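For anyone wondering how the efficiency columns were derived: they're just the FPS figures divided by board power. A quick Python sketch using the table's own numbers (nothing newly measured):

```python
# Recompute the Mean FPS/W and 99% FPS/W columns from the raw table values.
# Rows copied from the table above: (power ratio, board power W, mean FPS, 99% FPS).
rows = [
    (1.2, 425, 64.8, 77.6),
    (1.1, 410, 64.7, 77.3),
    (1.0, 390, 64.9, 76.9),
    (0.9, 370, 64.3, 76.3),
    (0.8, 340, 63.2, 75.1),
    (0.7, 310, 61.9, 73.1),
    (0.6, 260, 58.4, 68.9),
    (0.5, 210, 50.0, 59.3),
]

for ratio, watts, mean_fps, p99_fps in rows:
    print(f"{ratio:>4} | {watts} W | {mean_fps / watts:.2f} mean FPS/W | {p99_fps / watts:.2f} 99% FPS/W")
```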
In the era of cut-throat benchmark competition, it's unrealistic to expect manufacturers to do this of their own accord.
I think a good first move would be some kind of energy-labeling regulation that requires the energy usage of the default config to be accurately characterized, similar to how we insist automobiles advertise their fuel efficiency according to a prescribed measurement methodology.
Yes, try "MSI Afterburner". Works for all cards, not just MSI ones.Could someone give me a hint on how I can limit the power of my Nvidia raphics card, please? Is there a tool for this?
> Why are people and reviewers spreading lies and misinformation about the RTX 4090's power usage?
> Total System Power @ 1440p Ultra in Battlefield 2042 is less than 400 W. With DLSS and RT on, it's actually between 300-400 W, uncapped.
> World of Warcraft Ultra + livestreaming using NVENC: 250-300 W.
> BF2042 on 3 different 4090s in 3 different systems.
> Please tell me where you get these insane numbers from! Not even past 3050 MHz can I get beyond 500 W total system power!
Nah, it's not lies and misinformation. It is CPU limitations.
> Why are people and reviewers spreading lies and misinformation about the RTX 4090's power usage?
> Total System Power @ 1440p Ultra in Battlefield 2042 is less than 400 W.
That wasn't one of the games he tested, nor was WoW at Ultra. In another article examining the RTX 4090's FPS/W, I found that some games which underutilized the GPU also used significantly less power. Maybe you've stumbled upon two such titles? What was the average % utilization for those games?
> Please tell me where you get these insane numbers from!
Aside from some games using less power than others, did you look at the driver versions?
> It is CPU limitations.
Thanks! That lines up with my conjecture. We'd know for sure if we could get average % utilization figures on the RTX 4090 for those games.
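For what it's worth, average utilization and board power are easy to log without FrameView. A rough sketch (my own, not the article's methodology), assuming the nvidia-ml-py package: it samples the first GPU once per second while a game runs and prints the averages when you stop it with Ctrl+C. It's a software-side stand-in, not the PCAT hardware measurement used for the review.

```python
# Rough stand-in for the requested "average % utilization" figures: poll NVML once per
# second during gameplay, then report average GPU utilization and board power draw.
# Assumes the nvidia-ml-py package; stop the capture with Ctrl+C.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

util_samples, power_samples = [], []
try:
    while True:
        util_samples.append(pynvml.nvmlDeviceGetUtilizationRates(handle).gpu)  # percent
        power_samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)  # mW -> W
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

if util_samples:
    print(f"{len(util_samples)} samples")
    print(f"Average GPU utilization: {sum(util_samples) / len(util_samples):.1f}%")
    print(f"Average board power:     {sum(power_samples) / len(power_samples):.1f} W")
```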
The GPU can only go as far as the CPU does. In the games you mention, the CPU bottlenecks performance before a 4090 (or even a 3090, as Jarred reported in his benchmarks for BF2042) can fully stretch its legs. WoW isn't too graphically demanding, so it hits the CPU's limit long before the GPU can max itself out.
> Games like Cyberpunk 2077 or Metro Exodus Enhanced, which have the absolute most demanding graphics, use multiple types of ray tracing, and run at 4K resolution, will use 400+ watts because the 4090 is unconstrained. The GPU is running on all cylinders, putting out frames at a low enough speed that the CPU can keep up in these games.
Yup. If you click the Spoiler link in my post, you can see those are two of the highest power users.
> So getting a 4090 for anything lower than 4K is not a balanced config...
Not sure where you got that. According to the data I quoted from Jarred, 1440p Ultra averaged 126.9 FPS on the 13-game geomean and 104.3 FPS on the 6-game DXR geomean. Those are averages, though; we didn't see the 99th percentile numbers, which are arguably more important.
> Because averages represent everyone - just forget about those who only play certain titles/genres, where the averages could be quite a ways off.
Yes, and that's a general problem I have with averaging framerates across disparate games. However, it does show an overall picture where you cannot simply assume that even a GPU as powerful as the RTX 4090 can handle 4K at max quality. And that caveat is quite valid.
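For anyone unfamiliar with the "geomean" figures being thrown around: a geometric mean combines the per-game results multiplicatively, so a single very fast (or very slow) title skews the combined number less than a plain average would. A tiny sketch of the calculation; the per-game FPS values below are made up purely for illustration, not anyone's benchmark data.

```python
# Geometric vs. arithmetic mean over per-game FPS results ("13-game geomean" style).
# The per-game numbers are placeholders for illustration only, NOT real benchmark data.
import math

per_game_fps = [142.0, 98.5, 210.3, 76.1, 133.8]  # hypothetical per-game averages

geomean = math.exp(sum(math.log(fps) for fps in per_game_fps) / len(per_game_fps))
arith_mean = sum(per_game_fps) / len(per_game_fps)

print(f"Geometric mean:  {geomean:.1f} FPS")
print(f"Arithmetic mean: {arith_mean:.1f} FPS")  # pulled up more by the fastest title
```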
> And let me quote:
> "The 90% and 80% power limit both deliver similar performance, at least within the margin of error — 3% slower than stock, with power use of 368W and 346W, respectively. The 70% limit drops power use to just 309W, an 18% reduction versus stock, while performance is down 9%. The best setting for Fortnite, based on our testing, seems to be 60%: you get 91% of the base performance while using 29% less power."
> OK, my system struggles to hit 370 W TOTAL SYSTEM POWER, so where'd these "lower power usage" numbers come from?
So, you used the same games with the exact same settings? Also, exactly which model of card do you have? Because this testing was done with a Founders Edition.
> I'm guessing your numbers are "total system power draw"
Nope. The article clearly says the figures are for power drawn by the graphics card itself, not total system power.
> And this article's worth is questionable,
Disagree.
> if you're so worried about power usage you wouldn't be buying a 4090,
Disagree. Some people have rather expensive power, or would like to reduce the amount of heat the card puts out in the summer, because then you're paying not only to generate the heat but also to remove it (i.e. via air conditioning).
> Who on earth buys the world's most expensive gaming GPU and then restricts its performance,
Because you can dial it back a fair bit and still have the fastest GPU.
> Quite frankly that's the most irritating comment of late.
Then run the FrameView utility and post your data to show it's running at high utilization. I don't care what you believe about your GPU, I just want the data. This whole argument is about data, so if you don't provide exactly comparable data, your assertions are meaningless to me.
> Let me be perfectly clear right now: neither I, nor any other person I know with a 4090, can replicate the kind of power draws reviews like this are showing...
That's funny, because it doesn't look to me like you really even tried. For instance, you clearly didn't even bother to read the section titled "RTX 4090 Power Limiting Test Setup".
> Then run the FrameView utility and post your data to show it's running at high utilization. I don't care what you believe about your GPU, I just want the data. This whole argument is about data, so if you don't provide exactly comparable data, your assertions are meaningless to me.
Yes. And I should note that I'm seeing basically the same sort of power figures using a PCAT v2 with FrameView to capture actual in-line power consumption. I've shown those results in various reviews now, in the power testing section. For example, the AMD RX 7900 series launch review has six tables, showing the 7900 XTX, 7900 XT, 6950 XT, 4090, 4080, and 3090 Ti. The RTX 4070 Ti review has 11 table images showing the above six GPUs, along with the RTX 4070 Ti, 3070 Ti, 3080, 3080 Ti, and RX 6800 XT.

Those are not total system power figures; they're real data (averages over the benchmark run) for power consumed by the graphics card, including both the PCIe slot and any 8-pin or 16-pin PEG connectors. All of my retesting of current and previous-generation GPUs on the 13900K now collects this data, but I haven't finished all the testing: the RTX 30-series is done, but I still need to test six or so AMD RX 6000-series cards.
Provide test data with:
- Same game(s)
- Same settings (including, I'm going to guess, disabling V-sync. @JarredWaltonGPU ?)
- Same measurement methodology
- Full specs of relevant hardware & software
...or else I think we're done here.
> It's also not even remotely useful for him to say that he can't reproduce these power figures without actually listing full system specs, games tested, game settings, and test sequences.
Assuming someone would want to try to reproduce your data, where should they look to find the main details of your testing methodology? Things like whether to disable VSync (and if not, what monitor did you use?), whether the games have a specific benchmark mode, and how long you let it run. I assume our new friend has lost interest, but in case he or someone else wants to try.
> If you were to do the same tests with a Ryzen 9 5900X, I can guarantee power draw from the GPU would drop substantially, just because for gaming purposes, the 5900X is quite a bit slower than a 13900K.
I was hoping that we could simply see the effect in the % utilization figures, without having to get onto another whole tangent about CPUs.
> Assuming someone would want to try to reproduce your data, where should they look to find the main details of your testing methodology? Things like whether to disable VSync (and if not, what monitor did you use?), whether the games have a specific benchmark mode, and how long you let it run. I assume our new friend has lost interest, but in case he or someone else wants to try.
I tested six different RTX 4090 cards; the Founders Edition, PNY, and Gigabyte variants all had similar clocks, while the Asus, Colorful, and MSI models had slightly higher power limits. Vsync can generally be set to application preference in the drivers, but always make sure it's off in the games; Minecraft is an exception and needs to be forced off in the drivers. I am currently testing with either an Acer Predator X27 (4K 144Hz G-Sync) or a Samsung Neo G8 32 (4K 240Hz G-Sync Compatible, sometimes running at 120Hz because certain non-Nvidia cards don't like the 240Hz setting). I have confirmed in previous testing that the settings used don't change with the monitor, and I also checked with a 4K 60Hz display at one point.
> I was hoping that we could simply see the effect in the % utilization figures, without having to get onto another whole tangent about CPUs.
; )
Do you think the fact that you're using a Founders Edition is at all significant (i.e. if he's not)? Or is its power consumption comparable to the other 4090s?
> I am currently testing with either an Acer Predator X27 (4K 144Hz G-Sync) or a Samsung Neo G8 32 (4K 240Hz G-Sync Compatible, sometimes running at 120Hz because certain non-Nvidia cards don't like the 240Hz setting). I have confirmed in previous testing that the settings used don't change with the monitor, and I also checked with a 4K 60Hz display at one point.
The question about monitors was only in case you weren't disabling V-Sync. With it disabled, I wouldn't expect the monitor to affect anything (other than by virtue of its resolution, and perhaps sometimes whether HDR is enabled).
> The question about monitors was only in case you weren't disabling V-Sync. With it disabled, I wouldn't expect the monitor to affect anything (other than by virtue of its resolution, and perhaps sometimes whether HDR is enabled).
Yup. And I should note HDR is turned off, just to avoid that particular potential issue.
Speaking of which, have you ever looked at the performance impact of enabling HDR?