xenova:
It's not about lowering the energy bill. It's about more power consumption generating more heat inside the computer.
The power consumption argument that some people are focusing on doesn't make sense, because you are getting more work done for the power.
People often misread power consumption charts that show max watts, like when they run Prime95 for 10 minutes:
CPU1 uses 100 watts and sits at one temp,
while CPU2 uses 125 watts and sits at another temp.
But if you review how much work was done, CPU2 may end up doing 25%+ more work.
Some of the Tom's Hardware articles went into this and calculated the actual energy consumed for a particular LOAD.
The end result was that the fastest chip (at stock voltages) used the least energy. While its peak wattage was the highest, it finished faster, so once you account for the overhead running the whole time, the total electricity consumed by the fastest chip was lower.
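To make that concrete, here's a minimal Python sketch of the math. The watt and work-rate numbers are made up for illustration, not taken from the Tom's articles:

    def energy_for_workload(avg_watts, work_units, units_per_second):
        """Energy in joules = average power (watts) * time to finish (seconds)."""
        seconds = work_units / units_per_second
        return avg_watts * seconds

    WORKLOAD = 10_000  # arbitrary units of work; the same job for both chips

    # Hypothetical chips: CPU2 draws 25% more power but does 40% more work per second.
    cpu1_joules = energy_for_workload(avg_watts=100, work_units=WORKLOAD, units_per_second=50)
    cpu2_joules = energy_for_workload(avg_watts=125, work_units=WORKLOAD, units_per_second=70)

    print(f"CPU1: {cpu1_joules:,.0f} J")  # 100 W * 200 s  = 20,000 J
    print(f"CPU2: {cpu2_joules:,.0f} J")  # 125 W * ~143 s = ~17,857 J

Same job, higher peak draw, fewer total joules. That's why a max-watts chart on its own is misleading.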
For games, there is a twist: a game will max out the CPU at 100% usage regardless, so a faster chip just spends the headroom drawing extra frames, doing more work for you.
If you want to reduce power usage:
1) OC the multiplier only and don't change the voltages on your CPU, which should keep at least the same efficiency (actually increase it a slight bit by amortizing overhead). You are now getting the most work done for your electrons, even if it's a meaningless pixel drawn here or there. When you resort to raising voltages to get higher clocks is when you start losing efficiency (see the sketch after this list).
2) If you are concerned about the heat generated, then don't overclock the multiplier at all, or even throttle and underclock it, saving electrons at the cost of a few fewer fps.
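Here's a rough Python sketch of why point 1 holds. It leans on the standard CMOS dynamic-power approximation (P roughly equals C * V^2 * f) and assumes work done scales linearly with clock; it ignores static power and the fixed overhead whose amortization gives the slight efficiency gain mentioned above. All voltage and clock figures are invented:

    def energy_per_work(voltage, freq_ghz, capacitance=1.0):
        """Relative energy per unit of work = power / throughput."""
        power = capacitance * voltage**2 * freq_ghz  # P ~ C * V^2 * f
        throughput = freq_ghz                        # work per second ~ f
        return power / throughput                    # simplifies to C * V^2

    stock    = energy_per_work(voltage=1.20, freq_ghz=3.5)
    oc_mult  = energy_per_work(voltage=1.20, freq_ghz=4.2)  # multiplier-only OC
    oc_volts = energy_per_work(voltage=1.35, freq_ghz=4.6)  # OC that needs more volts

    print(f"stock:            {stock:.2f}")    # 1.44
    print(f"OC, stock volts:  {oc_mult:.2f}")  # 1.44 -> same efficiency, f cancels out
    print(f"OC, raised volts: {oc_volts:.2f}") # 1.82 -> ~27% more energy per unit of work

In this simplified model, energy per unit of work reduces to C * V^2: the clock cancels out, so a multiplier-only OC is efficiency-neutral, while any voltage bump costs you quadratically.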
The stuff about the Ivy Bridge TIM (the thermal paste under the heatspreader) is a moot point here.
If you are asking about power efficiency, which is what I'm talking about between AMD and Intel, then go with the Intel.