I already asked this question on the r/undervolt subreddit and the LTT Forum, but I haven't gotten a proper answer yet.
I know what undervolting is and the other basics. I undervolted my CPU with Intel XTU, and it was straightforward: just set a voltage offset. I wanted to do the same with my GPU, but most of the guides and tutorials I found use MSI Afterburner, where the process (as you may already know) involves adjusting a curve on a graph, with voltage on the X axis and frequency on the Y axis. Nobody explained how this actually affects the GPU's voltage.

I understand the curve to mean the GPU will run at "y" MHz when it's at "x" mV, and that x varies over time. But I always thought a constant voltage is applied to a digital circuit. That's how I understood CPU undervolting: the CPU always runs at X volts with a variable frequency, and I adjusted it to run at (X − a) volts to reduce the heat output. This is why GPU undervolting doesn't make sense to me. OK, let's say the circuit does run at a variable voltage. But editing the curve only involves changing the frequency (Y) values. How does that change the overall voltage applied to the GPU? And doesn't changing the frequency affect the GPU's performance? Is there something I'm missing about basic electronics?
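To make my mental model concrete, here's a toy Python sketch of how I picture the CPU case (the capacitance, voltage, and clock numbers are completely made up; it's just the textbook dynamic-power formula P ≈ C·V²·f):

```python
# Toy sketch of my mental model (all numbers are made up).
# Switching power of a chip is roughly P = C * V^2 * f,
# so lowering voltage at the same frequency cuts power quadratically.

def dynamic_power(c_eff: float, volts: float, freq_hz: float) -> float:
    """Approximate dynamic (switching) power: P = C * V^2 * f."""
    return c_eff * volts**2 * freq_hz

# CPU-style offset undervolt: same clock, voltage lowered by a fixed offset.
stock_w = dynamic_power(1e-9, 1.20, 4.5e9)
offset_w = dynamic_power(1e-9, 1.20 - 0.10, 4.5e9)
print(f"stock: {stock_w:.1f} W, undervolted: {offset_w:.1f} W")
print(f"same frequency, {100 * (1 - offset_w / stock_w):.0f}% less power")
```

That's why the offset approach made sense to me: same frequency, less voltage, less heat. What I can't map onto this model is how dragging frequency points around on Afterburner's curve ends up lowering the voltage.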
Also, is there any "one-click" undervolt tool for GPUs, like XTU? All I had to do there was set the offset.
Thanks in advance.