Well, no.
But in any appreciable user timeline...they don't.
Ahh...so we've established that much, at least.
But I think overclocking the snot out of a processor can shorten that timeline WAYYYyyyy down, because I've actually had it happen to me. I'd overclocked a 386SX-25 to something like 50 MHz (don't take that for gospel...I really can't remember THAT detail). It ran my dBASE indexing tasks for several months, but then it started throwing errors in the indexes and finally crashing. Processors ran wide-open all the time in those days (no idle states, no throttling), so it would crash even just sitting at the DOS prompt: I'd go to type a command and nothing would happen, not even an echo of my keystrokes. I opened it up, moved some jumpers around to bring the clock back down, and it booted and ran fine after that. I may be naive, but I'd say I had 'degraded' my processor.
I managed to swing the budget for an upgrade soon after, but who's to say it didn't just keep right on...ummm, degrading, let's say...after my abuse, until the poor sot who got it next gave up and tossed it in the trash.
Whether that's happening to OP or not I can't say...it might also depend on how Intel processors work internally. If 'degrading' takes the form of not boosting as high as it used to, then performance could suffer without the crashes and errors I had. I don't know whether that's his case; I'm just saying I can see a way for processors to 'degrade' without waiting for the universe to cool down to absolute zero.
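For what it's worth, if OP (or anyone) wants to actually test that boost theory, here's a rough sketch of what I mean. Fair warning: this assumes a Linux box, since it reads /proc/cpuinfo, and the busy-loop worker plus the two-second warm-up are just arbitrary choices on my part, not any official method. It samples the highest core clock the kernel reports while one core is pegged; if the chip used to hit its rated boost and now consistently tops out lower, that would be the quiet kind of 'degrading' I'm talking about.

```python
# Rough sketch: sample the highest reported core clock under load.
# Assumes Linux and its /proc/cpuinfo interface; won't work elsewhere.
import multiprocessing
import re
import time

def busy():
    # Spin forever to push one core toward its boost clock.
    while True:
        pass

def max_reported_mhz():
    # Parse every "cpu MHz" line and return the highest value seen.
    with open("/proc/cpuinfo") as f:
        mhz = [float(m.group(1)) for m in
               re.finditer(r"cpu MHz\s*:\s*([\d.]+)", f.read())]
    return max(mhz)

if __name__ == "__main__":
    worker = multiprocessing.Process(target=busy, daemon=True)
    worker.start()
    time.sleep(2)  # give the frequency governor time to ramp up
    for _ in range(5):
        print(f"max core clock: {max_reported_mhz():.0f} MHz")
        time.sleep(1)
    worker.terminate()
```

Run it a few times and compare against the chip's rated boost clock; a chip that never gets near it under a single-core load might be telling you something.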