[citation][nom]JonnyDough[/nom]I'm not sure where you got your figures. Maybe you should research it more instead of assuming you know everything like you usually do. It's pretty simple. Take the difference in wattage per hour, multiply that by the cost per hour, and then multiply it by an average use over three years time. You'll more than pay for it.With as much time as you spend on a PC posting know it all stuff all over the forums all the time, Al Gore would have fits.[/citation]
Oh no, I ask for the math that you say is sound in determining the wattage difference, so I can compare it against my own and see if I've made a mistake, and you have a fit and start mocking me. That certainly doesn't give me any reason to give your math much credit. Still, I suppose I'll just throw mine in.
http://en.wikipedia.org/wiki/80_PLUS
80 Plus Gold efficiency requirements:
20% load : 87% efficiency
50% load : 90% efficiency
100% load : 87% efficiency
80 Plus Platinum efficiency requirements:
20% load : 90% efficiency
50% load : 92% efficiency
100% load : 89% efficiency
For a 400W PSU, 20% load is 80W, 50% load is 200W, and 100% load is 400W. You're pretty much never going to hit 100% load unless you didn't get the right PSU for the job, so there's little point in considering it anyway. We'll focus on the 20% and 50% points because those are the realistic ones.
Those load points work out to wall draws of about 92W and about 222W for a PSU of exactly Gold efficiency, and about 89W and about 217.5W for a PSU of exactly Platinum efficiency. These numbers come from dividing the DC load from the hardware components by the efficiency at that load point, with the efficiency percentage expressed as a decimal (i.e., at 20% load, 80W divided by 0.87 for Gold and 80W divided by 0.90 for Platinum).
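If you want to check that arithmetic yourself, here's a minimal Python sketch of the same division, using only the spec-point efficiencies quoted above; the variable names and layout are just my own.
[code]
# Wall draw = DC load / efficiency at that load point.
# Efficiency figures are the 80 Plus Gold/Platinum spec points
# quoted above; nothing else here comes from a datasheet.

PSU_WATTS = 400.0
EFFICIENCY = {  # load fraction -> efficiency at that load
    "Gold":     {0.20: 0.87, 0.50: 0.90, 1.00: 0.87},
    "Platinum": {0.20: 0.90, 0.50: 0.92, 1.00: 0.89},
}

for load in (0.20, 0.50, 1.00):
    dc_watts = PSU_WATTS * load
    gold = dc_watts / EFFICIENCY["Gold"][load]
    plat = dc_watts / EFFICIENCY["Platinum"][load]
    print(f"{load:.0%} load: Gold {gold:.1f}W at the wall, "
          f"Platinum {plat:.1f}W, difference {gold - plat:.1f}W")
[/code]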
Those are differences of about 3W at 20% load and about 5W at 50% load. Only at 100% load does the gap reach about 10W (roughly 460W versus 449W at the wall), and like I said, if you're running a PSU near 100% often and for long stretches, you're doing something wrong, because running near 100% greatly accelerates a PSU's degradation. Incidentally, running a PSU near 0% load also increases degradation, but for different reasons.
Furthermore, it's not until you get into differences of a few dozen watts that you're likely to pay off a $20 or so price gap between the two over a few years, unless your electricity costs are huge compared to the averages for the continental USA.
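To put a dollar figure on it, here's a rough Python sketch of the payback math; the $0.12/kWh rate and the 8 hours a day of use are my own assumptions (the rate is in the neighborhood of the continental US average), so plug in your own numbers.
[code]
WATT_DIFFERENCE = 5.0   # roughly the Gold vs. Platinum gap at 50% load
RATE_PER_KWH = 0.12     # assumed rate, near the US average; use yours
HOURS_PER_DAY = 8.0     # assumed usage; bump to 24 for an always-on box
YEARS = 3.0

kwh = WATT_DIFFERENCE / 1000.0 * HOURS_PER_DAY * 365.0 * YEARS
print(f"{kwh:.1f} kWh saved, worth about ${kwh * RATE_PER_KWH:.2f}")
# 8h/day: ~43.8 kWh, ~$5.26; even 24/7 is only ~131.4 kWh, ~$15.77,
# still short of a ~$20 price difference over three years.
[/code]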
I'll also take this opportunity to mention that individual PSUs often vary a little from the exact Bronze, Silver, Gold, and Platinum specs. Some Golds match Platinums in efficiency and vice versa. Heck, I can even name a crappy Gold from RaidMax that only manages Bronze efficiency despite its rating, granted, it's a dirt-cheap model.