Ok... THERMAL wattage (TDP) is NOT the "same" number as ELECTRICAL wattage. TDP is a cooling spec: the sustained heat load the cooler is designed to handle, not a live readout of what the chip is pulling from the socket. Modern chips routinely draw well above their rated TDP under boost, so the two figures can disagree a lot at any given moment.
TDP is a single fixed figure, not something that tracks your load moment to moment. It's the heat output the cooling solution has to be able to overcome, not a measurement of instantaneous draw. But I guess that can be argued depending on how you're looking at it, I suppose.
Directly @ OP -> it's well known HWMonitor is glitchy, and I'm pretty sure that reading is one of its glitches. The only "true" way to tell what you're drawing is to get a wall power meter (a Kill A Watt or similar), measure total system draw, and subtract the knowns (fans, GPU, etc.) to estimate your actual CPU usage.
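For what it's worth, here's a back-of-the-envelope version of that subtraction in Python. Every number in it is a placeholder you'd swap for your own measurements, and the PSU efficiency figure is an assumption (check your unit's 80 Plus rating):

```python
# Rough estimate of CPU draw from a wall-meter reading.
# All numbers below are example placeholders -- plug in your own measurements.

wall_watts = 180.0       # what the wall meter shows under a CPU-only load
psu_efficiency = 0.90    # assumed 90% efficient PSU; wall draw includes PSU losses

# Known or estimated DC-side draws for everything that isn't the CPU, in watts.
other_components = {
    "gpu_idle": 15.0,        # GPU sitting near idle during a CPU-only load
    "fans_pumps": 10.0,
    "drives": 5.0,
    "motherboard_ram": 20.0,
}

# Power actually delivered inside the case, after PSU conversion losses.
dc_watts = wall_watts * psu_efficiency

# Whatever is left over after subtracting the knowns is roughly the CPU.
cpu_estimate = dc_watts - sum(other_components.values())

print(f"Estimated CPU draw: ~{cpu_estimate:.0f} W")
```

Obviously it's only as good as your guesses for the non-CPU parts, but it'll tell you whether a software reading is even in the right ballpark.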
As long as everything is running properly at proper temperatures, 60 W seems normal; I've seen people report readings around "30w" on Intel chips.