Higher power supply wattage uses more electricity?


IndirectHero

Does having a super-high-wattage power supply use more electricity than a less powerful one? For example: a 1000 watt power supply vs. a 400 watt power supply. Let's say the computer only uses about 350 watts max.
 


A 1000 Watt power supply ALWAYS draws more than a 400W one.

The more efficient the power supply is, the less power it draws.

For example, an 80 Plus Gold certified power supply has a very good 90% efficiency, so it draws 10% more than 1000 Watts from the grid. 1100 Watts, in other words. It needs to draw 1100 Watts to deliver 1000 Watts.

The same goes for a 400 Watt PSU. It needs, for example, 450 Watts to deliver 400 Watts.

So a 1000 Watt PSU constantly draws roughly double the power from the grid that a 500 Watt one does, even when it is not being used, and the excess usually dissipates as heat or vibration.
 
+1 for esrever.
The system takes whatever amount of DC power it needs from the PSU. The efficiency of the PSU determines how much AC power it draws from the wall socket to meet that DC power draw.
:eek: Drawing 1100W to provide 350W of DC power to the rig would mean that 1000W 80+ Gold PSU's efficiency is like... barely 32%.
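A minimal sketch of that relationship in Python; the 350W load and 90% efficiency are just the thread's example numbers:

[code]
# Minimal sketch of the point above: AC draw at the wall is the DC load the
# system asks for, divided by the PSU's efficiency at that load. The PSU's
# rated wattage never appears in the formula. Numbers are the thread's example.

def wall_draw_watts(dc_load_w, efficiency):
    """AC power pulled from the socket to deliver dc_load_w of DC power."""
    return dc_load_w / efficiency

# A 350W system on a 90%-efficient unit, whether it is rated 400W or 1000W:
print(wall_draw_watts(350, 0.90))  # ~388.9W from the wall, nowhere near 1100W
[/code]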

 


Because PSUs are most efficient in a sweet spot, you are not wrong. A larger PSU will be effectively slightly underloaded, and so in an attempt to provide 300W it might need to pull 340W, while a 400W PSU might only need to pull 320W to provide that 300W.

So a vastly oversized PSU will therefore be slightly more inefficient, but the difference is small, certainly much less than the cost of two PSUs over a couple of builds.

Your last sentence is utter rubbish, though: in the context of a PC needing a certain amount of power, i.e. 350W, the 1000W PSU does not draw double what the 400W PSU draws.
 
It depends on the PSU's 80+ efficiency rating, but it's a given that 20% of 1000W (200W) is bigger than 20% of 500W (100W)... and because you're running at a comparatively low load on the 1000W unit, you may not get into its 80%+ efficiency range, so your PSU may well end up drawing 30% extra due to reduced efficiency at lower wattages.

Basic system = 450W fully loaded...
Run on a 600W PSU, you will be in its 80%+ range, so it will draw 20% more than 450W... so 450W + 20% = 540W total draw from the wall.
Run on a 1000W PSU, you won't draw enough to get 80%+ efficiency, so the PSU will draw maybe 30% more: 450W + 30% = 585W.

Overall the 1000W will use more electricity on the same 450W system.
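A minimal sketch (Python, using the 450W and 80% figures from the post above) of the two ways of accounting for efficiency that get debated below; note that "add 20%" and "divide by 0.80" are not the same operation:

[code]
# Two ways of accounting for efficiency, using the post's 450W system and an
# assumed 80% efficiency. "Add 20%" and "divide by 0.80" give different draws.

dc_load = 450.0      # watts the components actually consume (DC side)
efficiency = 0.80    # assumed PSU efficiency at this load

additive = dc_load * 1.20        # 450W + 20% = 540W   (the shorthand above)
divisive = dc_load / efficiency  # 450W / 0.80 = 562.5W (power out / efficiency)

print(additive, divisive)  # 540.0 562.5 -- the shorthand understates by ~22W
[/code]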
 

You need to go back to the books on this one.
80+ means that the PSU will provide 80% or better efficiency anywhere between 20% and 100% output.
A 450 watt load on a 1kW PSU is considered by most to be close to optimal (it almost hits that magic 50% load number everyone cries for).
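For reference, a rough sketch of the 80 Plus efficiency floors as commonly quoted for 115V internal units; the exact percentages here are from memory, so treat them as assumptions and check the official spec:

[code]
# Assumed 80 Plus efficiency floors at 115V (from memory; verify against the
# official spec). The cert tests at 20%, 50% and 100% of rated load.

EIGHTY_PLUS_115V = {
    #            20%   50%   100% of rated load
    "80 Plus": (0.80, 0.80, 0.80),
    "Bronze":  (0.82, 0.85, 0.82),
    "Gold":    (0.87, 0.90, 0.87),
}

# A 450W load on a 1kW unit sits at 45% of rated load -- right next to the
# well-tested 50% point, and comfortably inside the certified 20-100% band.
print(f"{450 / 1000:.0%} of rated load")  # 45%
[/code]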
 
Hi Hexit, I disagree. If the 1000W supply @ 300W is only 70% efficient (which is an unreasonably low estimate @ 300W) and the 500W supply is 85% efficient at 300W:

The 1000W would then draw 300/0.7 = 428W and the 500W supply would draw 300/0.85 = 352W. 428/352 is 1.21, i.e. 21% more power is used, = 76W.

Given a more normal estimate of 80% and 85%:
300/0.8 = 375W, and 375/352 = about 6.5% more power, = 23W.

The size of the PSU is only relevant in that the efficiency at a given wattage will be different. Yes, 20% of 1000 is higher than 20% of 500, but that only matters @ 1000W, at which point the 500W unit has failed.

The other case is at very low, idle, power levels, say 100W. Here you might have 60% efficiency for the 1000W and 80% for the 500W:
100/0.6 = 167W and 100/0.8 = 125W; 167/125 = 33% more power consumed, or, to be useful, about 40W more.
40W extra, 24 hours a day, is about $35/£35 of electricity per year.
Under load (300W) the extra electricity is about half that.
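To put numbers on the comparison above, a minimal sketch in Python; the efficiency figures are the post's estimates, not measurements, and the ~$0.10/kWh tariff is an assumption chosen to land near the $35/year figure:

[code]
# Sketch of the comparison above. Efficiencies are the post's estimates;
# the electricity price is an assumed ~$0.10/kWh.

def wall_draw(dc_w, eff):
    return dc_w / eff  # power out / efficiency = power drawn

def annual_cost(extra_w, price_per_kwh=0.10):
    return extra_w / 1000 * 24 * 365 * price_per_kwh

# Idle-ish 100W load: 60%-efficient 1000W unit vs 80%-efficient 500W unit
big, small = wall_draw(100, 0.60), wall_draw(100, 0.80)
print(big - small)               # ~41.7W extra at the wall
print(annual_cost(big - small))  # ~$36.5/year if left on 24/7

# 300W load with the "more normal" 80% vs 85% estimates
print(wall_draw(300, 0.80) - wall_draw(300, 0.85))  # ~22.1W extra
[/code]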
 
It's just an estimate to get the point across, that's all... jeez, calm down....

The 450W system is the baseline, in that it actually takes 450W without any efficiency modification... which I guess is where you think I'm making a mistake...
So in my model the PC will take 450W and waste 20% as heat/noise (thus my 450W + 20%), which gives the end result on the 600W PSU; then I apply the same method to the 1000W PSU.
But I take into account the point where the 450W appears on the 1000W efficiency curve... the PC will still draw 450W, but because the 1000W PSU's efficiency curve will be different and will reach 80% at a higher wattage, I applied 30% additional power to represent this...

I'm taking the fact that efficiency builds on a curve the more power you use in relation to the size of the PSU, if they're both rated 80+. My numbers are just typical examples, nothing more...
So no, I don't think my mental model is off. My math, maybe, but not the idea behind it...

Gimme a few and I'll post a graph that may explain it better, as I'm dyslexic and that may be hampering me more than I think.


 
You're one of a dozen or so posters that I actually trust to give out decent answers, which is why I was a little taken aback by your answer. No need to do the sums other than for yourself; I've done them, and your words indicate you know what you are talking about.

Also, I think myself and delluser posted at the same time, so it wasn't a sustained attack :)
 
Going out on the piss about now; will consider. I have a different set of curves in my mind where the 500's curve is half as long as the 1000's, and hence the 450W point is in the same place on the x axis, but there are two points (one for each supply). This may equate to what you have.

I'll look at the BB code for how you posted that.
No probs.
 
It's just the code you get from Photobucket for its thumbnails...
URL=http://s259.photobucket.com/albums/hh314/jaymack71/?action=view&current=psueftcy.jpg] IMG]http://i259.photobucket.com/albums/hh314/jaymack71/th_psueftcy.jpg[/IMG][/URL]

I took the [ off the front of URL and IMG.
 


It doesn't really matter, as it's a representation, not an actual plot of an existing PSU. X is 0-100 percent efficiency and Y is the wattage. It's just a quick mock-up and not meant to be absolutely accurate.
It's just meant to show the 10% difference, and no, it's not to scale.
 

There's not going to be a 10% difference, unless you're now talking about different 80+ ratings along with the wattage difference; we didn't start off there.
80+ 1kW:
[attached efficiency chart: Picture0004.png]

80+ 620W:
[attached efficiency chart: Picture0003.png]

^ Corsair HXs, if it matters to anyone.

Looking at those charts, I'd say no more than maybe a 2% difference (if that) at the same 450W load.
 
Oh, FFS... do you actually know what a mock-up means...
I didn't name any brand; I just pulled numbers out of my ass to make the point that 1000W PSUs have to use more overall wattage to get 80+ on their efficiency scale...
At 200W it's likely that such a PSU will be 70 percent efficient, at 300 watts 75 percent, at 400W 80%, and at 500W it peaks in efficiency; then past 900W efficiency drops off again, but at a quicker rate... it's called an efficiency curve for that reason... and that's what I was explaining with my mock-up...
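To make the mock curve concrete, here's a quick Python sketch using those made-up points; the 85% peak value and the post-900W figure are assumptions added for illustration, not Hexit's numbers:

[code]
# Sketch of the mock efficiency curve described above -- invented points, not a
# real PSU. numpy.interp draws straight lines between the given points.
import numpy as np

load_w = np.array([200, 300, 400, 500, 900, 1000])       # watts of DC load
eff    = np.array([0.70, 0.75, 0.80, 0.85, 0.85, 0.78])  # assumed peak/droop

def wall_draw(dc_w):
    return dc_w / np.interp(dc_w, load_w, eff)

# On this curve a 450W load lands at ~82.5% efficiency:
print(wall_draw(450))  # ~545W at the wall
[/code]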


Seriously, pal, you have gone from trying to prove I'm a know-nothing idiot to being a complete dick who has a bee in his bonnet about anything I post...

There was no real PSU in my model.
My model was just that, a model. Nothing more...

Let me guess: you play WoW!...
 

nul_97



Pity that most of the posters appear to have a poor knowledge of electronics. Most "switched-mode" power supplies are near 80% efficient - THAT'S WHY THEY ARE USED - no need to dissipate large amounts of power in a regulator.

So: power out / efficiency = power drawn.

It has nothing to do with the max power rating.

 

Dan Johnson

My shot at this: for me, having a larger PSU is the knowledge that I can upgrade hardware without having to get a new PSU. CPU/GPU/DVD burner/hard drive etc. all require power. So, for example (GPU draws 300W, CPU draws 150W, DVD burner 100W, etc. [made-up numbers]), you will require a PSU that is 600W+. I am not an expert on the efficiency of a PSU; I just know that if x requires A and y requires B, then x + y requires A + B.
Again, no expert on this by any means... actually I was looking to see how much power certain programs draw to run... now I am thinking that graphics-based programs probably just make the GPU run harder, making it draw at the higher end of its max.
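A tiny sketch of that sizing rule, with made-up component numbers like the post's; the 25% headroom margin is an assumption, not a standard:

[code]
# Sizing rule from the post above: if each part needs A, B, C... watts, the
# PSU must cover the sum, plus some headroom. All numbers are made up.

components_w = {"gpu": 300, "cpu": 150, "drives_fans_board": 100}

total = sum(components_w.values())
headroom = 1.25  # assumed margin so the PSU isn't running flat out
print(total, round(total * headroom))  # 550W draw -> shop for roughly 700W
[/code]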
 