Would capping your framerate to 60 save energy?

Myronazz

Distinguished
Sep 5, 2016
So...

I just had this little theory in my head.
I noticed that when I let my computer generate as many frames as possible, the power supply's fan runs faster, which means the computer is drawing more power.
When I cap my framerate to 60, I can hear the power supply's fan slow down and make less noise, which means the computer needs less power to render 60 frames instead of 260.

So it should cost less to power your computer if you cap your framerate. Of course, that's just a theory that popped into my head at random; I could easily be wrong.

But it's always good to cap your framerate anyway. What's the point of generating 260 frames per second when your monitor can only display 60? Plus, your computer struggles less to produce 60 frames, and the less it struggles, the less heat it makes, which is better for its health.
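Just to make it concrete: a frame cap is really nothing more than the game finishing its work early and then waiting out the rest of each frame's time budget. Here's a rough illustrative sketch in Python (the sleep-based limiter and the render_frame callback are stand-ins for illustration, not how any particular game actually does it):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~16.7 ms per frame at 60 fps

def run_capped(render_frame, frames=600):
    """Render frames, sleeping away whatever time is left in each frame's budget."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                                       # the actual rendering work
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                             # CPU/GPU sit idle here instead of churning out extra frames
```

During that sleep the CPU and GPU are idle, which is exactly where any power saving would come from.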

So, am I correct?
 
To a degree. You've got the general gist of it, but the real-world power savings from using an FPS limiter are going to be pretty negligible in most cases.
 

I suspect the power savings would be pretty substantial going from 260 fps to 60 fps. CPU/GPU power consumption isn't linear: running them at twice the speed doesn't consume twice the power, it consumes more like 4x the power. The transistors need a higher voltage to remain stable at higher clocks, and the higher voltage (and temperature) also means more leakage.
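The usual back-of-the-envelope rule is that dynamic power scales roughly with frequency times voltage squared (P ≈ C·V²·f), so doubling the clock while also bumping the voltage pushes power well past double. Quick sketch with made-up numbers, just to show the shape of it:

```python
# Rough dynamic-power model: P ~ C * V^2 * f. The capacitance, voltage, and clock
# figures below are invented for illustration, not measurements of any real chip.
def dynamic_power(c, volts, freq_ghz):
    return c * volts ** 2 * freq_ghz

capped  = dynamic_power(c=30.0, volts=0.90, freq_ghz=1.0)   # chip loafing along under a 60 fps cap
flatout = dynamic_power(c=30.0, volts=1.25, freq_ghz=2.0)   # same chip at twice the clock, higher voltage

print(f"power at 2x clock: {flatout / capped:.1f}x")        # ~3.9x, not 2x
```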

However, while the difference is large in terms of Watts, it probably isn't that big in terms of dollars. Average electricity price in the U.S. is 12 cents/kWh. So if you manage to lower your GPU's power consumption from 150 Watts to 50 Watts, the savings of 100 Watts translates into 1.2 cents saved per hour of gaming.

Not that big a deal unless you add it up over a year. If you're an online gaming addict and spend 3,000 hours/yr playing (a bit more than 8 hours per day), it adds up to about $36 saved for the year. If you're looking to save electricity, you're usually better off reducing the power consumption of devices left on 24/7.
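If you want to plug in your own numbers, the arithmetic is just watts saved × hours played × your electricity rate. A quick sketch using the same 100 W and 12 cents/kWh figures from above (swap in your own wattage, hours, and local rate):

```python
# Savings = (watts saved / 1000) * hours * price per kWh.
WATTS_SAVED   = 150 - 50      # GPU dropping from 150 W to 50 W under the cap
PRICE_PER_KWH = 0.12          # rough U.S. average, in dollars

def savings(hours):
    return WATTS_SAVED / 1000 * hours * PRICE_PER_KWH

print(f"${savings(1):.3f} per hour of gaming")          # ~$0.012
print(f"${savings(3000):.2f} per year at 3000 hours")   # ~$36.00
```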
 
Solution


While this is true, in most games the uncapped framerate won't be that far above 60 anyway, unless the GPU is badly overpowered for the title. How often, when playing a AAA game, is a 60 fps cap really going to change the framerate by *that* much? And it'd be pretty silly to pay extra for a card capable of 120 fps in AAA games in the first place, only to cap it at 60 to save power and heat.

There's the *potential* for significant heat/power savings, but you'd need a nearly perfect storm of circumstances.