Though all of the above posters are absolutely correct, I'd like to add that, in general, a higher-wattage CPU does not necessarily mean higher power consumption. The wattage rating (the TDP) is not a measure of the CPU's actual average energy use under any realistic model of how people use their computers, and as far as I know it also ignores low-power idle states. AMD and Intel derive their ratings in slightly different ways, but in both cases the number is meant as a guideline for cooling, not for power draw. For two similar CPUs such as the two you mentioned, my wild guess is that the actual power draw difference at the wall between the dual and quad core would, most of the time, be relatively low - maybe on the order of 10 watts. At 12 cents per kilowatt-hour and three hours of use per day, that very rough estimate yields a yearly cost difference of:
(10 W) * (3 hours/day) * (365 days/year) * (1 kW / 1000 W) * ($0.12/kWh) ≈ $1.31/year
That assumes the difference between the two CPUs (in otherwise identical systems) is 10 watts. A reasonable range for the difference might be 1 to 40 watts - 1 watt being much lower than I'd guess it is, and 40 watts being much higher, except maybe under max CPU usage. Over that range, the yearly cost difference works out to between about 13 cents and $5.26. Either way, the cost doesn't seem worth worrying about.
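In case anyone wants to plug in their own numbers, here's a quick Python sketch of the same back-of-the-envelope arithmetic (the wattage figures, hours per day, and electricity price are just my guesses from above, not measured values):

def yearly_cost(extra_watts, hours_per_day=3.0, price_per_kwh=0.12):
    """Yearly cost in dollars of drawing extra_watts more at the wall."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * price_per_kwh

# Low guess, my guess, and high guess for the dual-vs-quad difference.
for watts in (1, 10, 40):
    print("%2d W extra -> $%.2f/year" % (watts, yearly_cost(watts)))

Run it and you get $0.13, $1.31, and $5.26 a year for 1, 10, and 40 watts respectively.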
All that just goes to show how you can do the sort of calculations that accord99 probably did, and that he isn't just pulling those numbers out of his butt.