Power supply: big difference between maximum current and rated input current?

MysteryUser001

Mar 7, 2016
According to my power supply specs, the input voltage is 100-240 V and the input current is 10 amps.
If the power supply is 550 W, then 550 W / 100 V = 5.5 amps, which is about half the rated input current.
My other power supply is 850 W and it claims 12 amps input current at 100-240 V.
But 850 W / 100 V = 8.5 amps, so even at the lowest voltage and full load the power supplies don't reach the rated input current.
So why the difference? Is it for safety reasons?
Most power cables are rated at 10 amps at 250 volts.
Thanks in advance.
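As a quick sanity check of that arithmetic, here is a small Python sketch using only the numbers quoted above (it deliberately ignores efficiency and power factor for now):

# Naive full-load input current: rated output power divided by mains voltage.
# Values are the ones from this post; real draw also depends on efficiency
# and power factor.
for rated_watts, label_amps in [(550, 10), (850, 12)]:
    naive_amps = rated_watts / 100  # worst case: 100 V mains
    print(f"{rated_watts} W PSU: {naive_amps:.1f} A calculated vs. {label_amps} A on the label")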
 
Solution
I'm not sure if I understand what your concern is, exactly. So if I'm off base, I apologize.

But the line voltage on the AC side of the power supply has factors entering the equation that aren't present on the DC side of the PSU: capacitive reactance, inductive reactance, and ordinary resistive impedance. These all work together to make up what is referred to as the power factor. The difference you are seeing is the wattage (power) lost to those other impedances.
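To put rough numbers on that distinction between what the wall sees and what the PSU actually takes in, here is an illustrative sketch (the current and power-factor values are assumptions, not measurements from any specific unit):

# Apparent power (volt-amperes) vs. real power (watts) on the AC side.
v_rms = 100.0         # mains voltage, volts
i_rms = 6.5           # assumed RMS input current, amps (illustrative)
power_factor = 0.95   # assumed, typical for a PSU with active PFC

apparent_power_va = v_rms * i_rms                 # what the wiring has to carry
real_power_w = apparent_power_va * power_factor   # power actually delivered to the PSU

print(f"Apparent power: {apparent_power_va:.0f} VA, real power: {real_power_w:.0f} W")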
 
Thank you for your answer.
I understand what you're saying, but then the power supply efficiency would be really low.
In the 550 W PSU example it would be
550 W / (10 A × 100 V) = 55% at full load, so it couldn't be 80 Plus Gold.
To be more precise, my concern is why the input current is rated so high (the PSU in question is a Thermaltake Toughpower 550 W).
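Here is that same check in a short sketch, alongside what the draw would look like at a more realistic efficiency (the 90% figure is only an illustrative assumption, not a spec for this unit):

# If the 550 W PSU really drew 10 A at 100 V at full load, efficiency would be:
implied_efficiency = 550 / (10 * 100)    # = 0.55, i.e. 55%
# At an assumed ~90% efficiency (illustrative), the expected draw at 100 V is:
expected_amps = 550 / 0.90 / 100         # about 6.1 A, well under the 10 A label
print(f"Implied efficiency: {implied_efficiency:.0%}, expected current: {expected_amps:.1f} A")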
 


Except (and correct me if I'm wrong, Clutch, since you probably know better) the reactive power does not mean more AC power is consumed; it means more reactive power circulating in the grid. You can have a low power factor and still have high AC-to-DC efficiency. The power factor would only affect industrial consumers bill-wise, at least in America, but efficiency affects everyone. Good PFC only helps increase the active-to-apparent power ratio and does not affect the unit's AC-to-DC efficiency.
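A small sketch of the two ratios being kept separate here (all numbers are illustrative assumptions):

# Efficiency and power factor are two different ratios.
dc_output_w = 500.0        # DC power delivered to the PC (assumed)
real_input_w = 555.0       # real AC power drawn from the wall (assumed)
apparent_input_va = 585.0  # V_rms * I_rms at the wall (assumed)

efficiency = dc_output_w / real_input_w          # AC-to-DC conversion, what 80 Plus rates
power_factor = real_input_w / apparent_input_va  # real vs. apparent power, what PFC improves

print(f"Efficiency: {efficiency:.1%}, power factor: {power_factor:.2f}")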
 
@turkey3_scratch
Actually, what you are saying about the end result of power factor loss on the consumer or end user is correct. That is why PSUs only have an 80% or so efficiency rating: power factor loss. Some of the power we are paying for (on the AC side of the PSU's transformer, capacitors, choke coils, etc.) is wasted in heat, inductive reactance, and capacitive reactance. There is always more power being consumed on the line side of a PSU than is being produced on the load side.