Question Why does boosting my GPU's clock raise the max clock but hardly change the current clock?

Aug 26, 2022
I'm trying to boost my 3090's clock using the GreenWithEnvy (GWE) app on Linux. I can apply a +200 MHz offset to the GPU clock and see that the "max clock" reading reflects the boost. But when I put some load on the device, the actual (current) clock hardly goes beyond 1900 MHz! Why is that?

Here's a screenshot of my settings:

[Screenshot of my GreenWithEnvy settings: GWE.png]
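
(In case it's useful: here's a minimal way to watch the same numbers outside of GWE. It assumes the nvidia-ml-py / pynvml Python bindings are installed and that the 3090 is GPU index 0.)

```python
# A minimal sketch (not GWE itself): read the reported max graphics clock once,
# then sample the current graphics clock while the GPU is under load.
# Assumes: nvidia-ml-py (pynvml) installed, and the 3090 is GPU index 0.
import time

from pynvml import (
    NVML_CLOCK_GRAPHICS,
    nvmlDeviceGetClockInfo,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMaxClockInfo,
    nvmlInit,
    nvmlShutdown,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0 is the 3090
    max_clock = nvmlDeviceGetMaxClockInfo(handle, NVML_CLOCK_GRAPHICS)
    print(f"Reported max graphics clock: {max_clock} MHz")

    # Sample the live clock a few times; run this while the workload is active.
    for _ in range(5):
        current = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_GRAPHICS)
        print(f"Current graphics clock:      {current} MHz")
        time.sleep(1)
finally:
    nvmlShutdown()
```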
 
Aug 26, 2022
And what would that be? I mean what are my options?

On the topic of "the only two things", I can see that I have the option to change the power limit (in the screenshot shown above, it's already at maximum). But I don't see any interface to change the voltage. Perhaps GWE does not support that? BTW, increasing the power limit hardly improves the performance. In fact, it damages the accuracy of the results (I'm talking about Machine Learning accuracy).
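
For reference, a quick way to see where the power limit actually sits relative to its allowed range, and what the card really draws under load. This is just a sketch assuming the nvidia-ml-py / pynvml bindings and GPU index 0:

```python
# A sketch only: query the power-limit range, the enforced limit, and live draw.
# Assumes: nvidia-ml-py (pynvml) installed, GPU index 0.
from pynvml import (
    nvmlDeviceGetEnforcedPowerLimit,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceGetPowerUsage,
    nvmlInit,
    nvmlShutdown,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0
    low, high = nvmlDeviceGetPowerManagementLimitConstraints(handle)  # milliwatts
    enforced = nvmlDeviceGetEnforcedPowerLimit(handle)                # milliwatts
    draw = nvmlDeviceGetPowerUsage(handle)                            # milliwatts
    print(f"Allowed power-limit range: {low / 1000:.0f}-{high / 1000:.0f} W")
    print(f"Enforced power limit:      {enforced / 1000:.0f} W")
    print(f"Current power draw:        {draw / 1000:.0f} W")
finally:
    nvmlShutdown()
```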
 
So here's the thing about how all this works:
  • Setting a frequency offset doesn't actually tell the GPU it can boost past whatever clock speed it was advertised to boost at. All GPUs since NVIDIA Pascal and AMD GCN5 will basically boost as high as they can until one of the following performance limiters is hit (this list comes from a HWiNFO screenshot; there's a small sketch after this list that queries the same limiters directly from the driver):
    [Screenshot: HWiNFO's list of GPU performance limiter flags]


    Though this particular set only applies to NVIDIA GPUs. I'm pretty sure AMD GPUs have something similar.

  • What setting a frequency offset actually does is shift every point on the GPU's voltage/frequency (V/F) curve. The curve is a set of points that tells the GPU which frequency it runs at for each voltage step. In MSI Afterburner, the line below the one with the dots is the default V/F curve; when I set an offset of +100 MHz or so, all of the points lift up. This is effectively undervolting the GPU, because each frequency is now reached at a lower voltage than it would be by default.

    So if at 2000 MHz the default is to set the voltage to 1 V, adding a +100 MHz offset means the GPU will try to reach 2100 MHz on 1 V. Normally the voltage would also need to go up a bit to help with stability.

    [Screenshot: MSI Afterburner V/F curve editor, with the offset curve sitting above the default curve]


  • However, there appears to be a soft voltage limit. Playing around with this, my card refuses to apply more than 1.1 V by default, despite what the curve shows. If I set a Core Frequency offset of -100 MHz, the card caps out at 2700 MHz; but if I add enough voltage (about 1.12 V on the V/F curve), it bumps up to the next frequency bin.
    [Screenshot: V/F curve with a point pushed to about 1.12 V, unlocking the next frequency bin]


  • So if you want to push your card's maximum boosting potential, leave the Core Frequency Offset alone at first and adjust the voltage. Once you've found a point you're comfortable with, then start pushing the Core Frequency Offset. That said, given there's a "Max operating voltage" performance limiter, there's a good chance the GPU won't let you set a voltage that'll fry it, but I'm not willing to test this on my own hardware.

    Also by default, MSI Afterburner doesn't enable voltage control. You have to go out of your way to enable it. This was for safety reasons. Your tool may have something similar in place.

  • But since you mentioned accuracy is a factor, here's my advice: don't overclock it. In fact, cap it at the base clock speed (see the clock-locking sketch at the end of this post). Why? Overclocking can introduce errors, leading to incorrect results. There's a reason why professional-grade cards run at lower clock speeds than their gaming counterparts.
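
Here's the sketch promised in the first point: it asks the driver which of those performance limiters is currently active. It's a minimal example assuming the nvidia-ml-py / pynvml Python bindings and GPU index 0; the labels are just a readable subset of the NVML throttle-reason flags.

```python
# A minimal sketch: ask the driver which performance limiter is holding the clock back.
# Assumes: nvidia-ml-py (pynvml) installed, GPU index 0.
from pynvml import (
    nvmlClocksThrottleReasonApplicationsClocksSetting,
    nvmlClocksThrottleReasonGpuIdle,
    nvmlClocksThrottleReasonHwPowerBrakeSlowdown,
    nvmlClocksThrottleReasonHwSlowdown,
    nvmlClocksThrottleReasonHwThermalSlowdown,
    nvmlClocksThrottleReasonSwPowerCap,
    nvmlClocksThrottleReasonSwThermalSlowdown,
    nvmlDeviceGetCurrentClocksThrottleReasons,
    nvmlDeviceGetHandleByIndex,
    nvmlInit,
    nvmlShutdown,
)

# Map each NVML throttle-reason bit to a readable label (subset chosen here).
REASONS = {
    nvmlClocksThrottleReasonGpuIdle: "GPU idle",
    nvmlClocksThrottleReasonSwPowerCap: "software power cap (power limit)",
    nvmlClocksThrottleReasonSwThermalSlowdown: "software thermal slowdown",
    nvmlClocksThrottleReasonHwSlowdown: "hardware slowdown",
    nvmlClocksThrottleReasonHwThermalSlowdown: "hardware thermal slowdown",
    nvmlClocksThrottleReasonHwPowerBrakeSlowdown: "hardware power brake",
    nvmlClocksThrottleReasonApplicationsClocksSetting: "application clock setting",
}

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0
    mask = nvmlDeviceGetCurrentClocksThrottleReasons(handle)  # bitmask of active reasons
    active = [label for bit, label in REASONS.items() if mask & bit]
    print("Active limiters:", ", ".join(active) or "none")
finally:
    nvmlShutdown()
```

Run it while the workload is going and you'll see which limiter is capping the clock at that moment.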
EDIT: I found out that Ada-based GPUs (RTX 40 series) apparently have a hard limit of 1.1 V, and supposedly you can't go past it.

So either my RTX 4070 Ti can go past it, the software's lying to me, or the hard limit is actually closer to 1.19 V.
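
And to go with the last point about capping the card: a minimal sketch that pins the graphics clock with NVML so runs stay reproducible. It assumes the nvidia-ml-py / pynvml bindings, root privileges, and GPU index 0; the 1395 MHz value is only a placeholder, so substitute your card's actual base clock. The nvidia-smi equivalent would be something like "sudo nvidia-smi -lgc 1395,1395", with "sudo nvidia-smi -rgc" to undo it.

```python
# A minimal sketch: pin the graphics clock so results stay reproducible.
# Assumes: nvidia-ml-py (pynvml) installed, run as root, GPU index 0.
# 1395 MHz is only an example value; use your own card's base clock.
from pynvml import (
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceResetGpuLockedClocks,
    nvmlDeviceSetGpuLockedClocks,
    nvmlInit,
    nvmlShutdown,
)

BASE_CLOCK_MHZ = 1395  # example only, replace with your GPU's actual base clock

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0
    # Lock the graphics clock to the base value; the card can still slow down
    # further if it hits a thermal or power limiter, but it won't boost above it.
    nvmlDeviceSetGpuLockedClocks(handle, BASE_CLOCK_MHZ, BASE_CLOCK_MHZ)

    # ... run the workload here ...

    # Restore normal boosting behaviour afterwards.
    nvmlDeviceResetGpuLockedClocks(handle)
finally:
    nvmlShutdown()
```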
 
Aug 26, 2022
Thanks for all the info, @hotaru.hino. As I'm using Linux, I don't have access to the same software as you do, and it seems like GWE doesn't let me change the voltage; in fact, it doesn't even report it. Maybe it's a shortcoming of Nvidia's Linux driver. Still, it's good to know how to work with the voltage. Thanks again.