It seems to me that virtually all of the power a CPU draws is converted into heat. I'm just a layman, but I'd guess that >90% of the power drawn ends up as heat (based on reading stuff like this: http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=44&Itemid=1&limit=1&limitstart=3).
So I'd argue the main reason TDP =/= a CPU's maximum power consumption is that TDP isn't meant to capture the theoretical maximum power dissipation of a CPU. Specifically, when TDP is calculated, (i) spikes of power dissipation so short that they're irrelevant for cooling-system design are ignored; and (ii) the test code used keeps CPU utilization below 100% (simply because it's so difficult to generate machine code that exercises all of a CPU's execution units at the same time).
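To see why point (i) makes sense, here's a toy simulation (all numbers made up, just to show the idea): a heatsink has thermal mass, so it effectively low-pass-filters the power trace, and a millisecond-scale spike above TDP barely moves the temperature the cooler actually has to handle.

```python
# Toy model: treat the heatsink as a first-order low-pass filter on power.
# TDP, TAU, and the spike values are all assumptions for illustration.

TDP = 95.0   # watts, a typical desktop-class figure (assumed)
TAU = 10.0   # heatsink thermal time constant in seconds (assumed)
DT = 0.001   # 1 ms simulation step

def peak_smoothed_power(trace, tau=TAU, dt=DT):
    """Peak of the low-pass-filtered trace: what the cooler 'sees'."""
    alpha = dt / (tau + dt)
    seen = trace[0]
    peak = seen
    for p in trace[1:]:
        seen += alpha * (p - seen)
        peak = max(peak, seen)
    return peak

# 10 seconds at TDP, with a 5 ms spike to 150 W in the middle
trace = [TDP] * 10000
for i in range(5000, 5005):
    trace[i] = 150.0

print(peak_smoothed_power(trace))  # barely above 95 W despite the 150 W spike
```

So even though the CPU momentarily drew well over its TDP, the cooling system never needed to dissipate more than roughly 95 W in this sketch.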
So my point is: to the extent TDP accurately measures a CPU's true maximum power dissipation, it's probably a pretty good measure of a CPU's maximum power draw. But OP, keep in mind that the CPU will dissipate a lot less heat than specified by the TDP under "normal usage." Under normal usage, the CPU often has so little work to do that it reduces its clock rate or turns off some of its units (or both).
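To put rough numbers on the clock-rate point: dynamic power in CMOS logic scales roughly as P ≈ C·V²·f, so dropping both the voltage and the clock under light load cuts power dramatically. The constants below are made up; only the scaling matters.

```python
# Classic CMOS switching-power approximation: P ≈ C_eff * V^2 * f.
# C_EFF and the voltage/frequency pairs are illustrative assumptions,
# not figures for any real CPU.

def dynamic_power(c_eff, volts, freq_hz):
    """Approximate dynamic (switching) power in watts."""
    return c_eff * volts**2 * freq_hz

C_EFF = 1.0e-9  # effective switched capacitance in farads (assumed)

full = dynamic_power(C_EFF, 1.2, 4.0e9)  # full clock at 1.2 V
idle = dynamic_power(C_EFF, 0.8, 1.0e9)  # down-clocked at 0.8 V

print(full, idle, full / idle)  # the ratio works out to 9x
```

Because voltage enters squared, lowering it alongside the frequency is where most of the savings come from; that's the basic idea behind the frequency/voltage scaling mentioned above.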
EDIT: thinking of Intel CPUs here...