Are AMD CPUs/GPUs really that hot?

kurac_palac
Jan 26, 2017
People constantly complain/joke about how AMD CPUs and GPUs overheat and can turn an entire room into a sauna. Is this true, or is it just Intel/Nvidia fanboy gloating?

My Intel E8400 idles at 38°C and my Radeon HD 4850 at 46°C.
 
Solution
The problem is age and architecture.

Some parts are less efficient than others. The FX series was a failed attempt: those chips had to run extremely hot and draw a lot of power to push clock rates high enough for even usable performance. AMD's R9 series was fairly similar; it sacrificed thermal performance for raw performance. (That said, aftermarket cards managed it well.)
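To put the "push clocks, pay in heat" trade-off in rough numbers: dynamic CPU power scales roughly with voltage squared times frequency, and higher clocks usually demand higher voltage, so power climbs much faster than performance does. This is only a back-of-the-envelope sketch; the baseline wattage and the voltage/frequency ratios are illustrative assumptions, not FX measurements:

```python
# Sketch: dynamic power scales roughly as P ~ C * V^2 * f.
# All figures are illustrative assumptions, not measured FX values.

def scaled_power(base_watts: float, v_ratio: float, f_ratio: float) -> float:
    """Scale a baseline power figure by voltage and frequency ratios."""
    return base_watts * v_ratio**2 * f_ratio

base = 125.0  # assumed stock power draw in watts
# A 20% clock bump that needs ~10% more voltage:
print(f"{scaled_power(base, 1.10, 1.20):.0f} W")  # ~182 W for +20% clocks
```

So a ~20% overclock can plausibly add ~45% to the chip's power draw, all of which comes out as heat.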

There is a trend that AMD generally runs warmer, but that's down to their struggling budget and R&D. They don't have the capital to fine-tune.
AMD's reputation for hot CPUs mostly comes from the old FX-8000 series, which was nearly the only lineup that stayed relatively competitive with Intel's until... Ryzen.
They ran hot, really hot, especially overclocked, which you practically "needed" to do to get your money's worth out of them.

For GPUs, again, older models like the R9 290X and 390X were kind of... hot. The RX 480 isn't as bad, but it isn't all that much faster either.
Things may change with AMD's RX 500 series and Vega, but neither is out yet, so any heat issues there are speculation.

So, yes, AMD CPUs ran hotter.
Yes, AMD GPUs were generally hotter than Nvidia's.

The sauna idea? Meh. If room temperature is 20 to 22°C at most and you don't live in a broom closet with the door closed, it is not an issue.

Edit: also, all the "hot" temperatures mentioned are under full load, not at idle. Heat at idle is NOT an issue.
 


It all depends on CPU/GPU architecture. Nvidia is also famous for extremely hot parts, like the GTX 280, the Fermi-based GTX 480, or the GeForce FX 5800. Intel's Pentium 4 processors were also much hotter than AMD's chips of the time. At this point AMD has released the Ryzen CPUs, and they seem to be more energy efficient again, but their GPUs are still on an older architecture that uses more power than Nvidia's parts. Also, a chip running hot is not a real problem in itself; that is mostly fodder for marketing and fanboy wars. What matters is power efficiency, and the costs associated with making and running a less power-efficient product.
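To make the running-cost point concrete, here is a rough sketch of what an extra 100 W of draw costs in electricity. The wattage gap, daily hours under load, and price per kWh are all illustrative assumptions:

```python
# Sketch: yearly electricity cost of a less power-efficient part.
# All numbers are illustrative assumptions, not measurements.

extra_watts = 100      # assumed extra draw vs. a more efficient part
hours_per_day = 4      # assumed hours under load per day (gaming, etc.)
price_per_kwh = 0.12   # assumed electricity price in USD per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{extra_kwh_per_year:.0f} kWh/year, about "
      f"${extra_kwh_per_year * price_per_kwh:.2f}/year")
# ~146 kWh/year, about $17.52/year
```

For a typical gamer the running-cost difference is real but modest; the heat dumped into the room, and the cooling needed to deal with it, are usually the bigger practical complaints.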
 
And it's even more true of the FX-9XXX series. The FX-9590 is essentially a factory-overclocked FX-8350, and the 8350 already had overheating and overclocking issues associated with it. The FX-9XXX chips are rated at 220W on top of that, which means that once you add any serious GPU, the build needs both a monster cooling solution and a PSU of at least 1 kW. Stack factory overclocking onto that CPU and it's essentially a powder keg waiting to happen.
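For a sense of where a ~1 kW recommendation comes from, here's a rough power-budget sketch. The GPU and rest-of-system wattages and the headroom multiplier are assumptions for illustration only:

```python
# Sketch: rough PSU sizing for an FX-9590 build.
# GPU/system wattages and the headroom factor are illustrative assumptions.

cpu_tdp = 220          # FX-9590 rated TDP in watts
gpu_board_power = 275  # assumed high-end GPU of the era (R9 290X class)
rest_of_system = 75    # assumed drives, fans, RAM, motherboard
headroom = 1.5         # assumed margin for load spikes and PSU efficiency sweet spot

recommended = (cpu_tdp + gpu_board_power + rest_of_system) * headroom
print(f"~{recommended:.0f} W recommended")
# ~855 W, i.e. shopping in the 850 W to 1 kW range
```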