According to my power supply's specs, the input voltage is 100-240 V and the input current is 10 A.
If the power supply is 550 W, then 550 W / 100 V = 5.5 A, which is only about half the rated input current.
My other power supply is 850 W, and its label claims 12 A input current at 100-240 V.
But 850 W / 100 V = 8.5 A, so even at the lowest voltage and full load the power supplies don't reach the rated input current.
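Here is a quick sketch of the arithmetic I'm doing, in Python. The efficiency figure is just my guess (the labels only state output wattage), added so conversion losses are at least roughly accounted for:

```python
# Rough estimate of mains current at the lowest rated voltage (100 V).
# Efficiency is assumed, not taken from the PSU datasheets.

def input_current(output_watts: float, mains_volts: float, efficiency: float = 0.85) -> float:
    """Estimate the current drawn from the wall at full load."""
    input_watts = output_watts / efficiency  # wall power exceeds DC output
    return input_watts / mains_volts

for watts, rated_amps in [(550, 10), (850, 12)]:
    amps = input_current(watts, 100)
    print(f"{watts} W unit: ~{amps:.1f} A at 100 V (label says {rated_amps} A)")
```

Even with that assumed 85% efficiency, the estimates come out around 6.5 A and 10 A, still below the labelled input currents.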
So why the difference? Is it for safety reasons?
Most power cables are rated at 10 A at 250 V.
Thanks in advance