Imagine, if you will, two identical CPUs. I don't have any specific CPUs in mind, so let's just call them CPU A and CPU B. Both were installed in systems with the exact same configuration (bar cooling). Both CPUs are used for rendering work, day in and day out; each could spend days or even weeks running heavy loads flat out at 100%.
But here's the difference. CPU A was given adequate (probably excessive) cooling: it was strapped to a massive heatsink with high-performance fans, tiny fans cooling the motherboard's VRM, and huge case fans. CPU A spent most of its life hovering around 60 degrees C. Meanwhile, CPU B was given subpar cooling — just enough to keep it from throttling or hitting TjMax — so it ran much hotter, at around 90 degrees C under the same load.
Now here's the question. After, say, 5 years of heavy usage, would there be any tangible difference in the performance of those two CPUs? By performance I mean power usage or stability, not just outright "number-crunching" capability.
I've never seen a CPU fail in my life, but I've heard that a CPU worked hard at high temperature for a prolonged period of time might require more power (e.g., a higher voltage) to achieve the same performance and stability, and thus runs hotter. If I were given both CPU A and CPU B 5 years down the line, would I be able to tell the difference between the two? In other words, do CPUs really deteriorate with age?
A friend asked me this question, and I couldn't answer it... but it makes me curious. I mean, I've seen computers today still running an i5-2500K or i7-2600K in "high-performance" applications. I wonder if those CPUs have lost performance over time, just like old cars...