Not sure what you meant by that, but I'd like to add another variable.
Temperature. Since your question revolves around how hot or cool the chip gets, assume the temps increase linearly with load, say from 20C at 0% to 80C at 100%, which works out to 0.6C for every 1% of extra load. Both chips will heat up along the same curve, since they have the same efficiency (double the power for double the performance).
Going by that, both chips should sit at 50C at 50% load. So when both are assigned the same task, say rendering a 100-second video, and to make it fair we limit the stronger chip (the one with double the performance) to 50% so that they both take the same time.
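If it helps, here's a quick back-of-the-napkin sketch of that linear load-to-temp assumption (the numbers and function name are just my made-up example, not real telemetry):

```python
# Toy model: temperature scales linearly with load, same curve for both chips.
IDLE_TEMP_C = 20.0   # assumed temp at 0% load
MAX_TEMP_C = 80.0    # assumed temp at 100% load

def temp_at_load(load_pct: float) -> float:
    """Linear interpolation between idle and full-load temperature."""
    return IDLE_TEMP_C + (MAX_TEMP_C - IDLE_TEMP_C) * (load_pct / 100.0)

print(temp_at_load(50))   # stronger chip capped at 50%  -> 50.0 C
print(temp_at_load(100))  # weaker chip running flat out -> 80.0 C
```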
Now, the stronger chip at 50% will operate at 50C while the weaker chip at 100% will operate at 80C, so technically the weaker chip will be hotter. But the heat released tracks the power consumed, and 50% of the double-TDP chip's power is still equal to 100% of the weaker chip's TDP: if the weaker one is 100W TDP and the stronger one is 200W, then 50% of 200W is still 100W.
So in theory (and only in theory), both should produce the same amount of heat over the same period of time.
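And the heat side of it in the same toy terms (the 100W / 200W TDPs are the hypothetical numbers from above, and I'm assuming power draw scales straight with load and all of it ends up as heat):

```python
# Toy heat comparison: heat energy = power draw * time.
RENDER_TIME_S = 100  # the 100-second render from above

def heat_joules(tdp_watts: float, load_pct: float, seconds: float) -> float:
    """Assume power draw is load% of TDP and is all dissipated as heat."""
    return tdp_watts * (load_pct / 100.0) * seconds

weak   = heat_joules(100, 100, RENDER_TIME_S)  # 100W chip at 100% -> 10000 J
strong = heat_joules(200, 50,  RENDER_TIME_S)  # 200W chip at 50%  -> 10000 J
print(weak, strong)  # same total heat, just at different chip temperatures
```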