rds1220 :
I thought Intel ran cooler. The Sandy Bridge and Ivy Bridge CPUs have a lower TDP and should run cooler than the older Phenom IIs and Bulldozers.
First of all, TDP != temperature. To accurately tell which CPU runs warmer, you must test with the same (aftermarket) heatsink, since the stock heatsinks from AMD and Intel differ quite a bit. Intel's stock push-pin aluminum "hockey puck" heatsinks for their mainstream CPUs suck; the only decent ones they make are the tower units for the recent Extreme Edition CPUs. AMD's heatsinks for their 125+ W TDP parts like the FX-8150 are pretty good units with copper heatpipes and will cool an otherwise identical CPU much better than the crappy Intel one.
Secondly, TDP is "thermal design power," not maximum dissipated power. TDP is a figure meant to roughly guide system integrators in choosing how large a heatsink and how much airflow a particular CPU needs. Intel and AMD calculate TDP quite differently, especially now that CPUs have a Turbo mechanism that "uses up unused thermal budget" to increase performance.
AMD CPUs are designed to never exceed their TDP under full load. AMD has an integrated power sensor on their Family 15h+ (Bulldozer/Piledriver/etc.) units, and power dissipation at full load at the full rated non-Turbo speed is below the TDP limit. The CPU will Turbo to the maximum pre-defined single- or multi-core Turbo speed for the particular CPU if its current power consumption is less than the TDP, or run at a lower Turbo speed to keep power consumption <= TDP.
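The AMD behavior boils down to a simple rule: each control interval, pick the fastest Turbo step whose power stays at or below TDP. A minimal sketch, assuming made-up speeds and power numbers (none of these figures or function names come from AMD's actual firmware):

```python
def pick_turbo_speed(turbo_steps, power_at, tdp):
    """turbo_steps: clock speeds in MHz, highest first.
    power_at: function estimating package power (W) at a given speed.
    Returns the fastest speed whose power estimate is <= TDP."""
    for speed in turbo_steps:  # try the highest Turbo step first
        if power_at(speed) <= tdp:
            return speed
    return turbo_steps[-1]  # fall back to base clock

# Example with a 125 W part; all power numbers are illustrative only.
steps = [4200, 3900, 3600]                      # MHz, max Turbo down to base
power = {4200: 130.0, 3900: 118.0, 3600: 95.0}  # W at each speed
print(pick_turbo_speed(steps, power.get, 125))  # -> 3900
```

With these toy numbers, the chip skips the 4200 MHz step (130 W would exceed the 125 W TDP) and settles at 3900 MHz, which fits under the limit.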
Current Intel CPUs are designed to average out at the TDP over the long term but can exceed the TDP for short periods of time. The CPU will allow itself to run at Turbo speeds where it can draw some multiple of its TDP for a limited time window (the defaults are 1.25x TDP for 28 seconds), then reduce Turbo so that power consumption == TDP, as long as the CPU does not overheat. The idea is that a CPU coming out of idle can run above its TDP for a short burst without overheating, because the heatsink is still cool and can absorb a higher rate of heat production for a little while without getting too hot. The over-TDP Turbo then backs off to at-TDP Turbo once the heatsink warms up to "full" temperature. This appears to be accurate, but I may have missed something; there is an entire 132-page manual just for the LGA1155 CPUs' electrical and thermal characteristics...
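The Intel scheme can be sketched as a toy simulation: allow up to a burst multiple of TDP while a running average of power draw is still under TDP, then clamp to TDP. The 1.25x and 28 s figures are the defaults mentioned above; the exponential-average model and everything else here is my own simplification, not Intel's actual algorithm:

```python
def allowed_power(avg_power, tdp, burst_mult=1.25):
    """Permit up to burst_mult * TDP while the running average is still
    under TDP; once the average reaches TDP, cap draw at TDP."""
    return burst_mult * tdp if avg_power < tdp else tdp

def simulate(tdp, demand, seconds, tau=28.0):
    """demand: power (W) the workload would draw if unconstrained.
    tau: time constant (s) of the running average, per the 28 s default."""
    avg = 0.0        # running average starts at 0: system was idle
    draws = []
    for _ in range(seconds):
        draw = min(demand, allowed_power(avg, tdp))
        avg += (draw - avg) / tau   # exponentially weighted moving average
        draws.append(draw)
    return draws

draws = simulate(tdp=77, demand=120, seconds=120)
print(draws[0])    # 96.25: bursts at 1.25 * 77 W while the heatsink is cool
print(draws[-1])   # 77: settles at TDP once the average catches up
```

The simulation shows the shape of the behavior described above: a heavy load starting from idle gets roughly 45 seconds above TDP with these parameters, then sustained draw settles at exactly TDP.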
So in short, it's difficult to say exactly which CPU will run how much hotter, due to the big differences in heatsinks and Turbo modes. If you run your CPU at sustained full load for a long period of time and use the same heatsink, AMD and Intel TDPs appear to be pretty comparable, and a 77 watt Intel CPU will run a little cooler than a 95 watt AMD unit. However, if you don't routinely peg cores for very long, multiply the Intel TDP by 1.25x and that should compare to an AMD CPU: essentially, a 95 W Intel TDP == a 125 W AMD TDP in that case.
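For what it's worth, the arithmetic behind that last comparison is just the burst multiplier applied to the Intel figure:

```python
# Rough rule of thumb from above: for bursty loads, an Intel chip's
# short-term draw is about 1.25x its rated TDP.
intel_tdp = 95
burst_draw = intel_tdp * 1.25
print(burst_draw)  # 118.75 W, roughly in the same class as a 125 W AMD part
```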