Question: Will reduced FPS save power?

gabecz

I have a cool setup now: 12700KF, 32GB DDR5 @ 7200MHz, and an RX 9070.
My TV maxes out at 144Hz.
I'm used to 30fps, so having 100+ doesn't do much for me, or not much anyway. I also read somewhere that anything above 80 is just unnecessary fanciness because you don't really "see" more physically.
My question is: if I cap the maximum FPS in video games to something like 60, will I save on my power bill?
I haven't gotten my first bill since the new build, but I'm genuinely scared (haha, jk). But for real, it will definitely be more than it was with my 8700K + RTX 3070.
I can do two things: listen to you pros, or test it out. The latter would of course take two months, although I can check my daily usage online.
 
My question is: if I cap the maximum FPS in video games to something like 60, will I save on my power bill?
Negligible difference.

You'd see a bigger difference from a more efficient PSU. E.g. if you're running 80+ Gold, an 80+ Titanium would make a difference at any load (for as long as the PC is powered on).
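For a rough sense of scale, here's a sketch comparing wall-side draw through the two tiers; the ~90% and ~94% figures are the approximate 80 Plus 115V numbers at 50% load, and real units vary:

# Rough comparison of wall-side power draw for the same DC load
# through an 80+ Gold vs an 80+ Titanium PSU (illustrative figures only).
dc_load_w = 300          # what the components actually pull from the PSU
eff_gold = 0.90          # ~80+ Gold at 50% load (115V spec)
eff_titanium = 0.94      # ~80+ Titanium at 50% load (115V spec)

wall_gold = dc_load_w / eff_gold
wall_titanium = dc_load_w / eff_titanium

print(f"Gold:     {wall_gold:.0f} W at the wall")
print(f"Titanium: {wall_titanium:.0f} W at the wall")
print(f"Difference: {wall_gold - wall_titanium:.0f} W")   # ~14 W in this example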

or test it out. The latter would of course take two months.
Two months? :mouais: I have a way to test it in 5 minutes.

Download and run HWinfo64,
link: https://www.hwinfo.com/download/

In Sensors mode, HWinfo64 reports the power draw of your GPU; the sensor is called "GPU Power".
Run your games at full tilt while keeping HWinfo64 running in the background. Afterwards, note down the Maximum wattage you see for GPU Power. You can note down the Average too if you like.

Then close HWinfo64 and relaunch it so that its min/max values are cleared. Cap your in-game FPS to 30 or 60 and game again. Afterwards, check whether your GPU actually consumed less power.
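If you'd rather not eyeball the readouts, HWinfo64 can also log sensors to a CSV file, and a few lines of Python can summarize the two runs. This is only a sketch: the column name ("GPU Power [W]") and the file names are assumptions, so check them against your own log.

import csv

def summarize(csv_path, column="GPU Power [W]"):
    # Average and peak wattage from an HWinfo64 sensor log.
    # The column name is an assumption; check the header of your own log.
    values = []
    with open(csv_path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            try:
                values.append(float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip footer/summary rows and blank cells
    if not values:
        raise ValueError(f"No '{column}' readings found in {csv_path}")
    return sum(values) / len(values), max(values)

# Illustrative file names; point these at your own two logs.
for label, path in [("uncapped", "uncapped_run.csv"), ("60 fps cap", "capped_run.csv")]:
    avg, peak = summarize(path)
    print(f"{label}: avg {avg:.0f} W, max {peak:.0f} W")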
 
My TV maxes out at 144Hz.
I'm used to 30fps, so having 100+ doesn't do much for me, or not much anyway. I also read somewhere that anything above 80 is just unnecessary fanciness because you don't really "see" more physically.
It's about how your monitor displays things: how much lag and ghosting there is, and at low FPS, if you turn your camera around, there are big gaps in what is being displayed.
You can see a side-by-side comparison here:
https://www.testufo.com/

As far as power goes: yes, reducing FPS will reduce power draw, but it might not change anything noticeable on your power bill.
As said above, find out how much power it's drawing now first.
 
I have a cool setup now: 12700KF, 32GB DDR5 @ 7200MHz, and an RX 9070.
My TV maxes out at 144Hz.
I'm used to 30fps, so having 100+ doesn't do much for me, or not much anyway. I also read somewhere that anything above 80 is just unnecessary fanciness because you don't really "see" more physically.
My question is: if I cap the maximum FPS in video games to something like 60, will I save on my power bill?
I haven't gotten my first bill since the new build, but I'm genuinely scared (haha, jk). But for real, it will definitely be more than it was with my 8700K + RTX 3070.
I can do two things: listen to you pros, or test it out. The latter would of course take two months, although I can check my daily usage online.
It depends how much you use it and how much it actually draws. You can undervolt your GPU and cap your frame rate, and you will save some power, but performance to power consumption isn't linear. By dropping your utilisation to 60% instead of 100%, you might be saving 20W or you might be saving 70W. However, to save 1kWh you're talking about 14 hours of gameplay at a 70W saving.
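For the arithmetic behind that last figure, energy is just watts times hours; a quick sketch using the 70W saving from above and the ~$0.33/kWh rate quoted later in this thread (purely illustrative numbers):

# Watts x hours = watt-hours; divide by 1000 for kWh, multiply by your rate.
saving_w = 70          # optimistic per-hour saving from capping FPS
hours_gaming = 14      # hours needed to save roughly one kWh at 70 W
rate_per_kwh = 0.33    # example rate in USD (the figure quoted later in the thread)

kwh_saved = saving_w * hours_gaming / 1000
print(f"{kwh_saved:.2f} kWh saved over {hours_gaming} h = ${kwh_saved * rate_per_kwh:.2f}")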
 
I haven't gotten my first bill since the new build, but I'm genuinely scared
What other things use electricity that you pay for?

The bulk of my electricity bill comes from water heaters, cooker, kettle, microwave, fridge/freezer, washing machine, central heating fan, etc.

A few kWh to run my main computer, which maxes out at 400W, is chicken feed in comparison, even at the equivalent of US $0.33 per kWh.
 
What other things use electricity that you pay for?

The bulk of my electricity bill comes from water heaters, cooker, kettle, microwave, fridge/freezer, washing machine, central heating fan, etc.

A few kWh to run my main computer, which maxes out at 400W, is chicken feed in comparison, even at the equivalent of US $0.33 per kWh.
I really have none of those 😀 It's a 1BR. I have one fridge with a small top freezer, I never use my AC, I have "free" hot water, and I cook with gas daily, so I use the microwave maybe twice a month. I watch TV and play games all day with a 65QN90DAF and a Denon receiver. So all in all, the PC is the main consumer in the household. But reading all the comments: A) it's too much hassle to even check the theory, and B) I really did sound cheap with my question. It's not really an issue having a $90 power bill instead of my current ~$45; I guess I was more curious than wanting to save two drinks' worth of money a month.
But thank you for the insights; I will play around with the suggested tools anyway. It's good to see how things work and test stuff.
Also, after just googling the TDPs of the 8700K vs 12700KF and the RTX 3070 vs RX 9070, the total difference is 50W... and that is close to nothing. And that's when both go full blast, whereas I rarely see the CPU hit 100% while playing most of my games.
Thanks again for the suggestions and insight.
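For what it's worth, the same arithmetic applied to that 50W spec-sheet difference, assuming a hypothetical four hours of gaming a day at the ~$0.33/kWh rate mentioned above:

# Cost of the ~50 W spec-sheet difference, assuming (hypothetically)
# four hours of gaming per day at the ~$0.33/kWh rate mentioned above.
extra_w = 50
hours_per_day = 4
rate_per_kwh = 0.33

kwh_per_month = extra_w * hours_per_day * 30 / 1000
print(f"{kwh_per_month:.0f} kWh/month, roughly ${kwh_per_month * rate_per_kwh:.2f}")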
 
I'd hazard a guess that the 65QN90DAF uses around 140W, and the Denon receiver (when pushed really hard) could consume another 100W+ if you've got the volume up loud enough to annoy the neighbours.
https://www.displayspecifications.com/en/model-power-consumption/2f404081

I'm using a Kenwood receiver to drive my 12in woofers, 5in mid-ranges, and 1in tweeters in the computer room.

Get a power meter and use it to calculate energy cost per day.
https://www.amazon.com/Electricity-Electrical-Consumption-Backlight-Protection/dp/B09BQNYMMM
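Once the meter gives you a kWh-per-day figure, the conversion to cost is one multiplication; a sketch with made-up numbers:

# Turn a plug-in meter's kWh-per-day reading into a daily and monthly cost.
# Both numbers below are made up; substitute your own reading and rate.
kwh_per_day = 1.8
rate_per_kwh = 0.33

print(f"${kwh_per_day * rate_per_kwh:.2f} per day, "
      f"about ${kwh_per_day * rate_per_kwh * 30:.2f} per month")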