Ten years for the 2600k is something I'd consider an anomaly in the industry, driven by the mining craze (which hurt gaming and, in turn, PC demand and the upgrade cycle) and by the absence of AMD as a serious threat to Intel. I don't think that scenario will repeat in the next ten years.
The 2600k's overclocking capability has also helped it stay viable. Maximum single-thread performance has increased by only about 50% over the last decade, so if overclocking gained you 20% more performance, you could keep up with almost half of the top-end processors produced in that span--and that's quite a feat!
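As a rough sanity check of that "almost half" figure, here's a back-of-the-envelope sketch. All of the numbers are assumptions, not benchmarks: a ~50% single-thread gain over the decade, roughly linear year-over-year growth, and a ~20% overclock on the 2600k.

```python
# Assumed numbers: 2011 stock 2600k = 1.00, ~50% single-thread
# gain across the decade, ~20% overclock headroom on the 2600k.
stock_2011 = 1.00
top_end_2021 = 1.50           # ~50% faster than the 2011 baseline
oc_2600k = stock_2011 * 1.20  # ~20% overclock

# Assuming top-end single-thread performance grew roughly linearly,
# estimate how many years of flagships the overclocked chip matches.
years = 10
gain_per_year = (top_end_2021 - stock_2011) / years   # 0.05 per year
years_competitive = (oc_2600k - stock_2011) / gain_per_year

print(round(years_competitive, 1))  # 4.0 -- i.e. ~40% of the decade's flagships
```

Under those assumed inputs, an overclocked 2600k matches or beats the top-end parts from roughly the first four of the ten years--about 40%, which lines up with the "almost half" intuition.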
But think back to earlier generations like LGA1366 and LGA775. While a few select chips overclocked well enough to stay viable, none were produced in such volume or were as easy to overclock as the 2600k (except maybe the Q6600). That's why very few of them were still around 10 years after their introduction--people moved on to the next generation.
History will teach you a lot about the future. With the upgrade cycle in full swing again, staying 'current' means upgrading every three years or so. You can skip a generation or maybe two depending on what you use your system for (or far more if you don't tax it much), but software will also dictate upgrades even when the hardware is still capable--and that's the current game: forced upgrades to generate revenue. Of course, for intense applications and workloads, only the upper crust of hardware is fast enough, and that's been a never-ending target since the 1990s.
I built some very expensive systems in the 1990s that were intended to last for many years. That was a time when applications didn't change much, gaming was in its infancy, and you didn't 'need' continuous OS upgrades and patching. Even then, we found there was an opportunity cost to not upgrading, simply because newer hardware ran faster and let us get our work done sooner. When time was money, holding off didn't make sense. Once operating-system bloat began with Windows 95, it just took off from there, and that brings us to today. Even corporations raising the topic of TCO (total cost of ownership) didn't slow the acceleration much.
I think as AMD continues to dominate the single-thread and value segments of the market, Intel will work on a 'one-up' solution, which AMD will then counter, and so on. And as long as developers can truly take advantage of these newer capabilities (like ray tracing, for example), you'll see fairly rapid market adoption until they become commonplace. Once that happens, an upgrade will be necessary for sure. And then the game continues.