Difference between 45 nm & 90 nm technologies

It's all a matter of heat and power consumption. Intel's current 32 nm designs run a little cooler and use less energy than some of the new AMD Bulldozer CPUs. Testing results can always vary with different equipment; people complain all the time about the CPU or board temp reported by generic software, and I tell them that generic hardware monitoring software doesn't report accurate temps for some motherboards. Older 90 nm CPUs run warmer than the newer 45 nm parts, which are themselves already outdated. My 45 nm Sempron 140 uses more energy than the i3 530, which has been out a while and has already been replaced by LGA 1155 CPUs like the i3-2120.
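To see why a smaller process draws less power, the usual first-order rule of thumb for CMOS switching power is P ≈ α·C·V²·f: a die shrink lowers both the switched capacitance C and the core voltage V. Here's a minimal sketch of that arithmetic; the alpha, capacitance, voltage, and frequency values are made-up illustrative numbers, not specs for any real CPU.

# First-order CMOS switching power: P ~ alpha * C * V^2 * f.
# Every number below is an illustrative guess, not a measured spec.

def dynamic_power(alpha, capacitance_f, voltage_v, freq_hz):
    """Estimate dynamic (switching) power in watts."""
    return alpha * capacitance_f * voltage_v ** 2 * freq_hz

# Hypothetical older, larger-node chip: more switched capacitance
# and a higher core voltage.
p_old = dynamic_power(alpha=0.2, capacitance_f=8e-8, voltage_v=1.4, freq_hz=3e9)

# Hypothetical die-shrunk chip at the same clock: the shrink cuts
# both the switched capacitance and the supply voltage.
p_new = dynamic_power(alpha=0.2, capacitance_f=4e-8, voltage_v=1.1, freq_hz=3e9)

print(f"larger node:  {p_old:.0f} W")  # ~94 W with these toy numbers
print(f"smaller node: {p_new:.0f} W")  # ~29 W with these toy numbers

Note the squared voltage term: even a modest drop in core voltage saves more power than the same proportional drop in clock speed.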
 
The difference is primarily in power consumption, heat, the amount of raw material used to create each microchip, and the number of microchips that can be created from a single silicon wafer.

Shrinking the die reduces the amount of raw material used to create each chip. Since the overall design is smaller, less electricity is needed to power the microchip; less electricity means less heat, which can also leave headroom for higher clock rates. And because each die is physically smaller, more microchips can be cut from a single silicon wafer, which reduces the cost to produce each individual chip.
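To put a number on the "more chips per wafer" point, here's a rough sketch using the standard first-order dies-per-wafer approximation. The 300 mm wafer diameter is the common size for these nodes, but the die areas are made-up illustrative figures, not real chips.

import math

# Rough dies-per-wafer estimate; the subtracted term accounts for
# the partial dies lost around the wafer's edge.

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order approximation of usable dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

wafer = 300            # mm, standard wafer diameter
old_die = 140.0        # mm^2, a made-up larger-node die
new_die = old_die / 4  # halving the feature size roughly quarters the area

print(dies_per_wafer(wafer, old_die))  # ~448 dies
print(dies_per_wafer(wafer, new_die))  # ~1906 dies

More than four times the dies per wafer, since fewer are wasted at the edge, which is why the per-chip cost falls so sharply after a shrink.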
 
As 01die said, the higher the nm, the more power the chip uses and the more heat it produces. It's also more expensive to manufacture each chip on a higher nm process, since fewer chips fit on each wafer. If you don't know what nm means, it refers to the size of the smallest features in the chip's manufacturing process, not the CPU's architecture itself.