I used to have an AMD FX8350, which has a TDP of 125W, and it ran incredibly cool at stock clocks under a Corsair H80i. However, when I switched to an Intel i5-4670K with a TDP of 84W, the load temps were on average 20°C higher than on the FX8350. This was a while back, but I do remember accounting for ambient temperature, and both results came from the same Corsair H80i using the same monitoring software. After I delidded my 4670K, the load temperature dropped by about 10°C, but I still do not understand why my FX8350 ran cooler than my 4670K, both at stock clocks and even after the delid.
As far as I understand, TDP stands for Thermal Design Power and is meant to reflect the expected heat output of a processor under typical loads. By that logic, my FX8350, with its higher TDP, should have run hotter than my 4670K, but that was not the case. Can anyone explain what the reason for this could have been?
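To make my own reasoning concrete, here is a rough series thermal-resistance sketch I put together. The die_temp helper and every resistance value in it are made up purely for illustration (not measured figures), but it shows how the same cooler can report different core temps for two chips even when the lower-TDP chip dissipates less power:

```python
# Rough model: T_die ~= T_ambient + P * (R_die_to_IHS + R_IHS_to_cooler + R_cooler_to_air)
# All resistance values below are hypothetical, chosen only to illustrate the idea.

def die_temp(ambient_c, power_w, r_die_to_ihs, r_ihs_to_cooler, r_cooler_to_air):
    """Estimate die temperature from power and a chain of thermal resistances (°C/W)."""
    return ambient_c + power_w * (r_die_to_ihs + r_ihs_to_cooler + r_cooler_to_air)

# Same cooler (same last two resistances), different assumed die-to-IHS resistance:
print(die_temp(25, 125, 0.10, 0.05, 0.15))  # FX8350-like, soldered IHS:         62.5 °C
print(die_temp(25,  84, 0.35, 0.05, 0.15))  # 4670K-like, polymer TIM under IHS: 71.2 °C
print(die_temp(25,  84, 0.23, 0.05, 0.15))  # after a delid with better TIM:     61.1 °C
```

If numbers anything like these were realistic, a lower-power chip could still show higher core temps simply because the path from die to IHS is worse, which is roughly what my delid result suggested to me.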
On a separate note, I see a lot of people mistake TDP for power consumption and automatically assume that a power-hungry processor will inherently produce more heat, ignoring the architectural power efficiency of said processor. Rumors suggest that an upcoming AMD GPU will have a 300W TDP, and that has people worried, mostly about power consumption. By the definition of TDP, this means it will be a very hot GPU, but what do you think it indicates about power consumption? If AMD has managed to improve this GPU's power efficiency, could it consume the same or less power than other GPUs with a lower TDP?
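Just to be clear about what I mean by architectural power efficiency, here is a trivial performance-per-watt comparison. The perf_per_watt helper and the numbers are entirely invented, not figures for any real GPU:

```python
# Performance per watt with invented numbers; higher means more efficient.
# A higher-TDP part can still be the more efficient one if it does proportionally more work.

def perf_per_watt(performance_units, board_power_w):
    return performance_units / board_power_w

print(perf_per_watt(12_000, 300))  # hypothetical 300 W GPU: 40.0 units/W
print(perf_per_watt( 7_500, 250))  # hypothetical 250 W GPU: 30.0 units/W
```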