Wanna trade links? Oh, you didn't provide a link, just a quote...
😛
50% Load Myth
On the 650W my peak loading condition will be at exactly 50% load. I will be getting about 82% efficiency. At idle I will be at about 15% load--call it 79% efficiency. Multitasking will be 23% load--80% efficiency. Heavy gaming will be 38% load or about 81% efficiency.
On the 400W my peak loading condition will be at about 80% load. I will be getting about 81% efficiency. At idle I will be at about 25% load--call it 80% efficiency. Multitasking will be 38% load--81% efficiency. Heavy gaming will be 63% load, or about 81% efficiency.
Let's average those figures.
650W average efficiency = 80.5% efficiency
400W average efficiency = 80.75% efficiency
Hey, a tiny difference in efficiency--in favor of the smaller PSU. How about that. Now consider that a quality 400W power supply with that kind of efficiency curve can be had for about $40, while a 650W PSU with that kind of efficiency might run $80. A $40 difference. So that's $40, plus slightly (slightly) higher electricity bill, by going with the 650W instead of the 400W.
Then you ask, "But what about upgradability?" Great point! Except that nine times out of ten some goof comes along and says, "No, you want to run your PSU at 50% load for the best efficiency!"
... Thus starting the cycle over.
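If anyone wants to check that math, here's a minimal Python sketch; nothing in it is measured, the load/efficiency points are exactly the ones from the example above:

```python
# Efficiency at each load state quoted for each PSU, straight from the
# example above: idle, multitasking, heavy gaming, peak.
psus = {
    "650W": [0.79, 0.80, 0.81, 0.82],
    "400W": [0.80, 0.81, 0.81, 0.81],
}

for name, effs in psus.items():
    avg = sum(effs) / len(effs)
    print(f"{name} average efficiency = {avg:.2%}")

# Output:
# 650W average efficiency = 80.50%
# 400W average efficiency = 80.75%
```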
So, as that example shows, getting a higher-wattage PSU does nothing for overall efficiency; if anything, the smaller unit comes out slightly ahead. Or, if you like images:
Debunking Power Supply Myths
http://images.anandtech.com/reviews/psu/2008/psu-power-myths/eff1.png
The first system causes this high-performance power supply to only run at 73% to 81% efficiency, depending on input voltage. Obviously, there's absolutely no need for a 900W power supply if you're running this type of computer.
The midrange system looks quite a bit better, allowing the PSU to run at 80% to 88% efficiency, although the latter only occurs at maximum load. Considering the vast majority of systems rarely run at 100% load most of the time, real-world efficiency will average closer to 82%. Office work and Internet surfing in particular will be at that level.
For the third system, a 900W power supply actually might start to make sense. It's still more than you need, but having a bit of extra room to grow isn't a bad idea. This system idles at over 300W, so it achieves a minimum 86% efficiency with 120VAC. When running a game or other demanding task, the PSU is finally able to reach its potential and provide 89% efficiency with 230VAC (or 87.5% with 120VAC).
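The arithmetic behind those figures is just wall draw = DC load / efficiency. A minimal sketch, using only the ~300W idle and 86% efficiency numbers from the passage above:

```python
def wall_draw(dc_watts: float, efficiency: float) -> float:
    """AC watts pulled from the outlet for a given DC load."""
    return dc_watts / efficiency

# Third system from the article: ~300W idle at 86% efficiency (120VAC).
idle_dc = 300
print(f"at the wall:  {wall_draw(idle_dc, 0.86):.0f} W")             # ~349 W
print(f"lost as heat: {wall_draw(idle_dc, 0.86) - idle_dc:.0f} W")   # ~49 W
```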
Both of those examples take idle and full-load usage into consideration.
To repost a chart pertaining to the OP's actual usage:
http://i.neoseeker.com/neo_image/202359/article/evga_gtx660ti_sc/powerusage.png
With a single GTX 660 Ti SC they won't go over 100 watts at idle or 300 watts under load, so on a 750-watt PSU the maximum load is about 40%; nowhere near 50%, which is the neighborhood of maximum efficiency (50%-60% depending on the PSU).
Add another 660 Ti, which adds 18 watts at idle and ~150 watts under full load, for a total draw of about 117/441 watts (idle/load), and a 750-watt PSU still doesn't make any sense unless the system is running at full load ~100% of the time, because the idle draw is still under 20% of the rated wattage (150 watts).
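To put rough numbers on those load percentages, here's a quick sketch. The single-card draws are the rounded "under 100W idle / under 300W load" figures from that chart, the SLI draws are the 117W/441W totals above, and the PSU wattages are the ones being argued over in this thread:

```python
# (idle, full load) system draw in watts -- approximate figures from above.
systems = {
    "single 660 Ti SC": (100, 300),
    "660 Ti SLI":       (117, 441),
}

# Candidate PSU sizes from this thread.
psu_sizes = [450, 550, 650, 750]

for name, (idle, load) in systems.items():
    for psu in psu_sizes:
        print(f"{name} on a {psu}W PSU: "
              f"idle at {idle / psu:.0%}, full load at {load / psu:.0%} of rating")
```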
I follow jonnyguru also. You quoted a general opinion without any specifics, and I believe you are using it out of context. And btw, the malarkey I was referring to was:
so you get better grade parts, more rigid specs, and most likely tighter QC
Like I said before: getting better/higher-quality components doesn't mean you need any more power.
The build posted would run fine with a 450-500 watt PSU, and a 550 with four 6-pin power connectors would be able to run a 660 Ti SLI setup. If the OP wants to get an "oversized" PSU, then a 650 watt would be fine; a 750 watt PSU is too much. A 750 watt gold-rated PSU would be a waste of money when it sits there idling at ~80% efficiency.
Not a good recommendation.