egilbe:
45 watts difference is enough to pay for itself? I'm not sure what you are paying for electricity, but it would take over 10 years at that rate to pay for itself.
If you are going to make assumptions, you should say what they are. So should the person you quoted.
Ten years at how many hours of use per day? One hour? If so, those 10 years shrink down to 1 year just by using the PC 10 hours a day.
Also, it's not important to recover the entire cost of the processor. What matters is recovering the gap between worse processor A and better processor B. Once that gap is recovered, you are in the black purely on the numbers. Thus, to "pay for itself," the chip really only needs to cover 1/2 to 1/3 of its total cost, i.e., the difference between most Intel chips and their cheaper, higher-wattage "equivalents".
Case in point: the FX-8150 ($190 on Newegg) usually comes up in comparisons with the 2500K ($220), and comes up lacking, btw. The difference in pricing is $30 out of $220, or about 1/7 of the total cost. Only that $30 needs to be recovered for there to be zero downside to choosing the Intel chip, even in terms of the up-front purchase price.
Rounding some numbers off for simplicity (and ignoring that OCing is heavily done with both processors, which skews the data even further towards the Intel chip), using the Intel chip saves you about 1/20 of a kWh per hour, and at 10 cents per kWh every hour the computer runs saves about half a cent. 2 hours makes 1 cent. 200 hours makes $1. 6,000 hours makes $30.
Where I live, electricity varies from 7.2 to 13.2 cents per kWh across plans from about 100 different providers with widely varying terms of service, so I went with a value in the middle, both for ease of calculation and because the cheaper plans often have service terms that make up the difference anyway. Plenty of plans at 10 cents here are also 100% renewable, for those who are environmentally aware.
Anyway, at 6 hours a day that is 1,000 days to recover the gap, which is about 3 years. At 12 hours a day the time shrinks by half, down to a year and a half. Most hardcore gamers will probably fall somewhere between those, and some may even push the break-even point down pretty close to 1 year.
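If anyone wants to plug in their own rate or daily usage, here's a quick Python sketch of the arithmetic above (the 1/20 kWh rounding, the 10-cent rate, and the $30 gap are the figures already quoted; swap in your own numbers and the break-even point moves accordingly):

```python
# Break-even estimate using the figures from the post above.
kwh_per_hour_saved = 1 / 20   # the ~45 W gap, rounded up to 1/20 kWh per hour
rate_per_kwh = 0.10           # electricity rate in dollars per kWh (mid-range)
price_gap = 30.00             # up-front price difference in dollars

savings_per_hour = kwh_per_hour_saved * rate_per_kwh   # $0.005 per hour of use
hours_to_break_even = price_gap / savings_per_hour     # 6,000 hours

for hours_per_day in (6, 12):
    days = hours_to_break_even / hours_per_day
    print(f"{hours_per_day} h/day: {days:,.0f} days (~{days / 365:.1f} years)")
# 6 h/day: 1,000 days (~2.7 years)
# 12 h/day: 500 days (~1.4 years)
```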
It may really take 10 whole years to recover the entire $220, but that is an entirely flawed way to look at a cost-benefit analysis anyway, regardless of what you assume "pay for itself" means.
Going back to where I said I was ignoring OCing: it's pretty common for gamers not to ignore it, and at an average rate of +2% power for +1% performance, the higher-wattage FX scales up in wattage much faster than the lower-wattage SB chip does.
Say the OCer wanted +50% performance in both cases (taking the 2500K to 5 GHz vs. the 8150's 5.4 GHz), which is pretty aggressive. That would double the wattage gap between the two and halve the numbers above, making it quite easy for hardcore gamers to recover the difference in one year. Even if you tamed the OCs down to a more normal +1/3 (the 2500K goes to about 4.4 GHz and the 8150 to more like 4.8 GHz), it would still take less than 12 hours a day to recover the whole $30 in one year.
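To see how the OC scenario halves those numbers, here's the same kind of sketch. The stock load draws (95 W and 140 W) are just assumed figures picked to give the ~45 W gap; the +2% power per +1% performance rate is the one quoted above:

```python
# How overclocking widens the wattage gap, assuming the +2% power per
# +1% performance scaling rate applies to both chips equally.
def oc_watts(stock_watts, perf_gain):
    # +2% power for every +1% of performance gained
    return stock_watts * (1 + 2 * perf_gain)

intel_stock, amd_stock = 95, 140   # assumed stock load draws (~45 W gap)
for perf_gain in (0.0, 1 / 3, 0.5):
    gap = oc_watts(amd_stock, perf_gain) - oc_watts(intel_stock, perf_gain)
    print(f"+{perf_gain:.0%} performance: gap ~ {gap:.0f} W")
# +0% performance: gap ~ 45 W
# +33% performance: gap ~ 75 W
# +50% performance: gap ~ 90 W (double the stock gap, so half the break-even time)
```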
Anyway, that is all just a serious effort to look at the numbers for direct cost. It doesn't even begin to factor in indirect costs, like the CPU fan running at a lower RPM with the SB than with the FX, the added strain on the PSU, and the effects of higher temperatures on the other hardware in the computer. Those factors slant the results even further towards Intel and away from AMD.
- edit - clarity