"Some factors, such as higher current density and higher temperature, increase the rate of wear by accelerating electromigration (the gradual displacement of metal atoms in a chip's interconnects, caused by momentum transfer from the flowing electrons). Because increasing frequency requires pumping more power through the chip, thus generating more heat, higher frequencies typically result in faster aging and a shortened lifespan. These problems become more pronounced at smaller feature sizes, as transistors shrink inside modern chips (like AMD's move to a 7nm process and Intel's to 10nm), simply because the chip is pushing more current through smaller transistors and interconnects.
So, like the carton of milk in your refrigerator, your chip has an expiration date. It's the job of smart semiconductor engineers to predict that expiration date and control it with some accuracy, which is a difficult proposition given the unique characteristics of each and every individual piece of silicon that comes out of the fab. Because switching the transistors at higher frequencies and higher temperatures increases the rate of wear on them and on the surrounding structures, frequency is one of the primary levers that engineers pull to control the lifespan of your chip.
In short, reducing frequency can slow the aging process, thus increasing longevity."
This sounds like technobabble to me.
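It isn't technobabble, though the quoted definition is loose. The relationship it describes is commonly modeled by Black's equation for electromigration lifetime, where mean time to failure falls with current density and temperature. A minimal sketch (the article doesn't give numbers; the constants here are illustrative values in the typical published range, not from the quote):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def em_mttf(j, temp_k, a=1.0, n=2.0, e_a=0.8):
    """Black's equation: MTTF = A * J^(-n) * exp(Ea / (kB * T)).

    j: interconnect current density (relative units here)
    temp_k: temperature in kelvin
    a: process-dependent constant (illustrative)
    n: current-density exponent, typically ~1-2
    e_a: activation energy in eV, roughly 0.7-0.9 for common metals
    """
    return a * j ** (-n) * math.exp(e_a / (K_B * temp_k))

# Same current density, but running 20 C hotter (85 C vs 105 C):
cool = em_mttf(j=1.0, temp_k=358.0)
hot = em_mttf(j=1.0, temp_k=378.0)
print(f"relative lifetime, 85C vs 105C: {cool / hot:.1f}x")
```

The exponential temperature term is why the article treats heat (and therefore frequency, which drives power and heat) as the main lever: a modest temperature rise cuts modeled lifetime by a large factor, while the J^(-n) term captures the penalty of pushing more current through shrinking interconnects.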