Question: Why not keep fans off until 55 C (131 F)?

May 18, 2021
If the fans are off below 55 C, they are quiet, last longer, and use less energy. It's my understanding that temps below 60 C are harmless to CPUs and GPUs, so why not leave the fans off at lower temps? I've been fine-tuning my fan curves using Argus Monitor and I'm really happy with the results. My computer idles between 30 and 55 C depending on the ambient room temp, so the fans stay off until I play a game or edit photos. Argus Monitor has a hysteresis setting that keeps the fans from rapidly cycling on and off.
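A hysteresis setting works by using separate on and off thresholds so the fans don't toggle when the temperature hovers near a single cutoff. Here's a minimal sketch of that logic with assumed thresholds (Argus Monitor's actual values are user-configurable, and its implementation isn't public):

```python
# Assumed thresholds for illustration only.
FAN_ON_TEMP = 55.0   # C: spin fans up at or above this temperature
FAN_OFF_TEMP = 50.0  # C: only stop fans again once temps fall below this

def update_fan_state(temp_c: float, fans_running: bool) -> bool:
    """Return whether the fans should run, given the current temperature.

    The gap between the on and off thresholds is the hysteresis band:
    a fan that just started at 55 C keeps running until the temperature
    drops below 50 C, instead of flickering on and off around 55 C.
    """
    if not fans_running and temp_c >= FAN_ON_TEMP:
        return True
    if fans_running and temp_c < FAN_OFF_TEMP:
        return False
    return fans_running
```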
 
May 18, 2021
It depends on whether your computer allows the CPU fan to be stopped; most computers won't start without the CPU fan spinning.
When my computer boots up, the BIOS has control of my fans, so it starts up fine; Argus takes control of the fans after that.
 
From what I've heard from electrical engineers, the lifespan of electronics halves for every 10 C increase in temperature. Of course, this requires knowing the baseline lifespan of the part for that figure to matter. For the sake of argument, let's say I can get 5 years out of my CPU as-is, with a 50/50 idle/load split to keep the math easy, and that the average idle temp is 35 C. Assuming my math is right, having the CPU idle at 55 C instead drops the expected lifespan to about 3.125 years.
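The arithmetic above can be sketched as follows, assuming only the idle half of the lifespan is affected by the hotter idle temperature (the simplification used in the example):

```python
def expected_lifespan(base_years: float, idle_share: float,
                      idle_temp_c: float, baseline_idle_c: float = 35.0) -> float:
    """Scale only the idle portion of the lifespan by the 10 C halving rule."""
    # Each 10 C above the baseline halves the idle-time lifespan.
    idle_factor = 2 ** ((idle_temp_c - baseline_idle_c) / 10.0)
    load_years = base_years * (1.0 - idle_share)        # load portion unchanged
    idle_years = base_years * idle_share / idle_factor  # idle portion shortened
    return load_years + idle_years

print(expected_lifespan(5.0, 0.5, 55.0))  # → 3.125
```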

Depending on how long your upgrade cycle is, running the fans to maintain lower temperatures is cheaper overall: if a fan breaks, it's a much cheaper part to replace than a CPU that has degraded out of spec.

And if saving energy is your goal, turning off the fans is basically trimming muscle; there are fattier things to cut, since fans draw less than 3 W apiece even at full blast.

EDIT: Also, if you get a quality fan, it lasts a very long time anyway. Noctua rates their fans for 150,000 hours, which is about 17 years of running 24/7.
 
May 18, 2021

Thanks for the info. I'm sure heat degrades electronics, but another missing data point is: at what temp does degradation of ICs begin? For example, I doubt they degrade at room temperature, and I doubt they degrade at 100 F, while 105 C might destroy an IC in seconds. Based on this, I believe the curve of temp vs. degradation is not linear.

Here is a long article about it: "Does a 10°C Increase in Temperature Really Reduce the Life of Electronics by Half?" They mostly debunk that claim, though they acknowledge that heat does degrade electronics at some point. I would love to see data from a study of IC temp vs. degradation. There should be a lot of data about that, but I couldn't find any in a quick search.

I read that MSI GPUs have an option to keep the fans off until the chips reach 60 C, but I haven't seen that mentioned in their specs.
 
It's also important to understand where temperature factors into such failure modes, as noted in the article you posted:

In general, the Arrhenius model is likely appropriate for certain failure mechanisms including corrosion, electromigration and certain manufacturing defects [1], but is not suitable for other significant failure modes, such as the formation of conductive filaments, contact interface stress relaxation, and fatigue of package-to-board level interconnect [5]
But yes, as I acknowledged in my post, we don't have enough data to make a truly informed decision. Given the information we currently have, though, I'd rather base my choices on what's available now, and the three failure modes affected by temperature in that quote are still significant enough for me to believe temperature shouldn't be that high if you can help it.
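For the failure mechanisms where the Arrhenius model does apply, its temperature dependence can be sketched as an acceleration factor between two temperatures. The 0.7 eV activation energy below is a common textbook assumption for silicon mechanisms, not a value from the article:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(t_use_c: float, t_stress_c: float, ea_ev: float = 0.7) -> float:
    """Acceleration factor: how much faster a part ages at t_stress_c vs t_use_c."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

# With an assumed Ea of 0.7 eV, idling at 55 C instead of 35 C ages the part
# roughly 5x faster for Arrhenius-governed mechanisms.
print(arrhenius_af(35.0, 55.0))
```

Note how sensitive the result is to the assumed activation energy, which is one reason the blanket "halves every 10 C" rule doesn't hold across all parts and failure modes.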

Also, your reasoning for keeping parts at a higher temperature to save the fans doesn't make sense to me because, again, fans are cheaper to replace, and a quality fan is rated to last a really long time anyway.

EDIT: Also, my concern for my use case isn't so much outright failure. It's that I've tweaked my processor to clock faster at the lowest possible voltage, so I'm essentially asking it to run at the edge of its limits. Higher temperatures will degrade the silicon to the point where I can no longer keep the settings I want. Once the processor is no longer stable, I'll have to drop the frequencies, bump up the voltage, or both.
 
May 18, 2021
Thanks again for the input.

I was just surprised by some new temp data fresh from playing around with my computer fans. Fans at 25% can reduce CPU and GPU temps by 10 C!
With my GPU fans off, at idle, my GPU was 53 C.
With my GPU fans on at 100%, at idle, my GPU was 43 C.
With my GPU fans on at 25%, at idle, my GPU was still 43 C.
Based on that, I might as well have the fans always on, starting at 25% and ramping up after 40 C.

The CPU temp dropped from 50 C to 41 C when I turned the CPU and output fans on at only 20%, so I might as well run those fans at a 20% minimum and ramp up from there.

Below 50 C the front case fans don't make much difference, so I will leave them off until the GPU or CPU reaches 55 C. The rear output fan is in line with the CPU fans, so its speed is identical to theirs.

It has been fun doing these tests with Argus Monitor.
 
