Question: USB 1 or 2 Computer Port Maximum Amperage?

Feb 21, 2019
I have a commercial device that normally operates at 5 volts and 425 milliamps. Sometimes the device can spike for a second or two and draw as much as 550 milliamps. I want to run this device on a computer USB 1 or 2 port that is rated for 5 volts and 500 milliamps. When this device spikes to 550 milliamps, do I run the risk of damaging a USB 1 or 2 computer port? Do USB 1 or 2 computer ports have some sort of protection against amperage spikes? What’s the maximum amperage a USB 1 or 2 computer port can tolerate without being damaged? Please no opinions. I'd appreciate known facts or specs only.
 
Both USB 1.0 and 2.0 ports are capable of delivering up to 500 mA (0.5 A) according to the standards, but that does not mean every manufacturer follows the standards to a T.
Computers should have overcurrent protection against devices drawing more power than the port can deliver; they respond by cutting power to the USB port.
There are also motherboards that will accommodate devices when spikes like those occur.
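As a side note, if you have a Linux box handy, you can see how much current each attached device declares in its configuration descriptor (the figure the host budgets its 500 mA per port against) by reading the bMaxPower attribute in sysfs. A rough sketch, plain Python and Linux only, assuming the usual sysfs layout:

```python
import glob
import os

# List the current each attached USB device declares in its configuration
# descriptor (bMaxPower). This is the declared maximum the host budgets for,
# not the current the device is actually drawing at any given moment.
for dev in sorted(glob.glob("/sys/bus/usb/devices/*")):
    max_power_path = os.path.join(dev, "bMaxPower")
    if not os.path.isfile(max_power_path):
        continue  # interface entries and the like have no bMaxPower attribute
    product_path = os.path.join(dev, "product")
    name = (open(product_path).read().strip()
            if os.path.isfile(product_path) else os.path.basename(dev))
    with open(max_power_path) as f:
        print(f"{name}: declares {f.read().strip()}")
```

Keep in mind that is the declared maximum, not what the device actually draws from moment to moment.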
 
I’ve been testing the limits of a couple of USB 2.0 ports on two different old throwaway computers. So far, I’ve been able to get up to 857 milliamps at 4.79 volts on one computer and 926 milliamps at 4.40 volts on the other. I only keep it at this level for about 10 seconds, and it hasn’t fried either port. I could even go higher if I wanted. I find this amazing, since the USB 2.0 spec is only 5.0 volts at 500 milliamps.
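Just to put numbers on those readings, here is the quick back-of-the-envelope power at each point (a trivial Python sketch, nothing USB-specific):

```python
# Power at each measured operating point, plus the nominal USB 2.0 budget.
points = [
    ("computer 1", 4.79, 0.857),
    ("computer 2", 4.40, 0.926),
    ("USB 2.0 spec", 5.00, 0.500),
]
for label, volts, amps in points:
    print(f"{label}: {volts:.2f} V x {amps * 1000:.0f} mA = {volts * amps:.2f} W")
```

Both measured points work out to roughly 4.1 watts, well past the nominal 2.5 watts, which is what got me thinking about the following.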

I’m beginning to think the USB 2.0 spec of 5.0 volts and 500 milliamps really applies to the underlying power, in watts, the port should supply on a continuous basis. That is, the 2.5 watts the port and its electronics are subjected to is the important factor, not so much the individual volts or amps.

Say I run at 4.1 volts and 610 milliamps, which is 2.5 watts. Does that have the same effect on the electronics as running at 5.0 volts and 500 milliamps, which is also 2.5 watts?
 
I wouldn't look at it that way; the PC is still trying to output 5V. It's just hitting the limits of what the voltage regulator can output, combined with additional losses from resistance. To the best of my knowledge, the effect on the circuitry would be largely determined by the current specifically.
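To put rough numbers on that: both operating points deliver the same power, but not the same current, and it's the current that sets the I²R heating in the connector, traces, and regulator. A minimal sketch; the 0.1 ohm wiring resistance is an assumed illustrative figure, not a measured one:

```python
# Compare two operating points that deliver the same power but different current.
# The 0.1 ohm wiring resistance is an assumed illustrative figure, not a measurement.

def describe(volts, amps, wiring_resistance_ohms=0.1):
    power = volts * amps                              # power delivered to the load (W)
    wiring_loss = amps ** 2 * wiring_resistance_ohms  # I^2 * R heating in the wiring (W)
    print(f"{volts:.1f} V x {amps * 1000:.0f} mA = {power:.2f} W, "
          f"wiring loss ~{wiring_loss * 1000:.0f} mW")

describe(5.0, 0.500)   # nominal USB 2.0 budget
describe(4.1, 0.610)   # sagged-voltage case: same ~2.5 W, ~22% more current
```

Same 2.5 W either way, but the lower-voltage case pushes about 22% more current and therefore roughly 50% more resistive heating in the wiring.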
 
Now a question for some of you smart double E guys.

Experimentally, it’s difficult to use my device to measure current draw at various voltages. I’d like to replace my device with a potentiometer, just dial in a resistance at different voltages, and then measure current draw. How does the resistance of a circuit vary with voltage? My results are strange: they show a fairly linear region, then the graph flattens out as voltage increases. It looks like the equation for resistance versus voltage for a circuit is some sort of 1/x function. Is that correct? I was hoping to use least squares to find a simple linear equation to use.
 
If all you had as a load was a potentiometer, then the resistance would be whatever you set the pot to, and it wouldn't change with voltage. I = V/R.

I have no idea what the internals of your "device" are, so it's hard to say what its current/voltage profile would look like. Are you saying that current decreases as voltage increases, but then levels off after the voltage reaches a certain point?
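If you do collect a table of voltage and current readings, one way to settle it is to least-squares fit two candidate models and compare the residuals: a fixed resistance gives current rising linearly with voltage (I = V/R), while a constant-power load gives current falling off as 1/V (I = P/V). A rough sketch; the measurement pairs below are placeholders, not real data:

```python
import numpy as np

# Placeholder voltage/current readings (replace with your own measurements):
v = np.array([4.0, 4.2, 4.4, 4.6, 4.8, 5.0])              # port voltage (V)
i = np.array([0.620, 0.595, 0.570, 0.545, 0.520, 0.500])  # device current (A)

# Model 1: fixed resistance, I = V / R  ->  least-squares fit of the slope 1/R.
slope, *_ = np.linalg.lstsq(v.reshape(-1, 1), i, rcond=None)
r_fit = 1.0 / slope[0]
rms_r = np.sqrt(np.mean((i - v / r_fit) ** 2))

# Model 2: constant power, I = P / V  ->  least-squares fit of P against 1/V.
p_coef, *_ = np.linalg.lstsq((1.0 / v).reshape(-1, 1), i, rcond=None)
p_fit = p_coef[0]
rms_p = np.sqrt(np.mean((i - p_fit / v) ** 2))

print(f"fixed-R fit:        R = {r_fit:.2f} ohm, RMS error = {rms_r:.4f} A")
print(f"constant-power fit: P = {p_fit:.2f} W,   RMS error = {rms_p:.4f} A")
```

Whichever model leaves the smaller residual is the better description of the device; if neither fits well, it probably has an active regulator doing something more complicated.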