Evidence for Temp damaging CPU??

ac3144

Distinguished
Apr 27, 2009
281
0
18,790
Hi Guys,
Is there any firm evidence (not anecdotal) that running your CPU close to the maximum temperature stated by the manufacturer results in reduced CPU life?

I understand (on a basic level) about electromigration (EM), but have any real studies been completed that examine the temperature/CPU-life relationship?

People talk about EM, and it may be that I just don't know enough about it, but from my reading it seems largely theoretical, or likely to come from stressing chips with heat in the short term rather than from running them hot (but not ridiculously hot) over a longer period, say a year or so.

I am happy to be educated via links etc., and also happy to gather anecdotal evidence from individual experience. I think this is an important topic for overclocking, and I would like to find out more.

Thanks for the replies
AC

randomizer

Champion
Moderator
Personal experience is hard to give when most people wouldn't run a CPU hot 24/7 until it died. Because this is likely to take years to happen (in theory, assuming a CPU that wasn't prone to failure already), even starting now would mean that by the time you had some data the CPUs you were working with would be obsolete, and the results may only be partially relevant.

ac3144

Exactly! So where does the notion that running your CPU near its max temp will reduce its life come from? Who has done the studies? This is what I'm trying to find out.
Keep thoughts coming.. :)
AC

ac3144

So has anyone ever had a chip fail that they are pretty sure was down to overclocking? If so, over what time period?

In the meantime, I am going to email the computer science boffins at my University (Bristol in the UK) and see if they can point me to any literature out there about this.

AC
Well, electromigration is an established fact. But circuit elements used to be large enough, and CPU life cycles short enough, that I think a CPU was obsolete before it could fail from electromigration.

In modern CPUs, the gate insulators can be as little as 10 atoms thick, so failures should be more common now. On the other hand, Intel's published "do not exceed" CPU voltages and temperatures seem to be pretty conservative. And, by and large, the OC community seems to be pretty careful about not exceeding those limits.
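For what it's worth, the usual model for the temperature half of this is Black's equation, MTTF = A * J^(-n) * exp(Ea / kT), where the exponential term says lifetime shrinks as absolute temperature rises. A minimal sketch of just that temperature term (the 0.7 eV activation energy is a commonly quoted ballpark for interconnect electromigration, not a measured figure for any particular CPU, and current density is held constant):

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K


def relative_mttf(temp_hot_c, temp_ref_c, activation_energy_ev=0.7):
    """Ratio of predicted electromigration MTTF at temp_hot_c vs temp_ref_c.

    Uses only the Arrhenius (temperature) factor of Black's equation:
    exp(Ea/k * (1/T_hot - 1/T_ref)), with temperatures in kelvin.
    The activation energy here is an illustrative assumption.
    """
    t_hot = temp_hot_c + 273.15
    t_ref = temp_ref_c + 273.15
    return math.exp(activation_energy_ev / K_BOLTZMANN_EV * (1.0 / t_hot - 1.0 / t_ref))


# Example: compare a core held at 85 degC against one at 60 degC.
# The ratio comes out well below 1, i.e. the model predicts a markedly
# shorter lifetime at the higher temperature.
print(relative_mttf(85, 60))
```

Note this only says lifetime falls with temperature under the model's assumptions; it says nothing about whether the absolute lifetime at either temperature is shorter than the CPU's useful life, which is really the question being asked in this thread.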