I'm currently looking for a suitable power bank / portable battery, and I've found one that I want, though it has two different amperage outputs: one 1.0 A port and two 2.1 A ports. However, my mobile phone is built for 2.0 A.
Normally, a unit will try to draw the current it needs even if the cable providing the power isn't capable of delivering that current, as long as the electricity goes straight into the unit.
But according to a contact of mine, who used to be an industrial researcher in batteries and fuel cells, things don't work that way when we're dealing with batteries: it's the power supplier that pushes out whatever current is on its output, and the mobile phone battery receives that amperage - unless the device has a system adjusting it, such as a BMS (found in car batteries and UPSes).
So if I were to charge a 1.0 A mobile phone with a 2.0 A charger, I would push a 2.0 A current into the 1.0 A battery - which means it will charge faster (which, in my experience, it really does in the real world), but it may also decrease the battery's lifetime and increase wear and tear - at least according to this person I know. However, numerous other people I've talked to suggest there is no harm in this, so I'm a bit confused.
There is a lot of value in many people telling me not to worry about it, but on the other hand, someone I know who is more or less an expert suggests it may put the battery under extra stress.
In my particular case I have a mobile phone rated at 2.0 A and a power bank port rated at 2.1 A - is there any risk that the extra 100 mA could decrease the lifetime and increase the wear and tear of my phone's battery?
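To make the two competing claims concrete, here is a minimal sketch of the two mental models described above - the "supplier pushes" model from my contact versus the "device draws" model others suggest. The function names and the `min()` rule are purely illustrative assumptions on my part, not taken from any real charger specification:

```python
# Sketch of the two charging models discussed above.
# Function names and behaviour are illustrative assumptions only.

def pushed_current(supply_max_a: float) -> float:
    """The 'supplier pushes' model: the battery simply receives
    whatever current the supply port is rated for."""
    return supply_max_a

def negotiated_current(supply_max_a: float, device_max_a: float) -> float:
    """The 'device draws' model: the phone's charging circuit
    limits the draw to what both sides can handle."""
    return min(supply_max_a, device_max_a)

# My case: a 2.1 A power bank port and a phone built for 2.0 A.
print(pushed_current(2.1))           # 2.1 A forced into the battery?
print(negotiated_current(2.1, 2.0))  # or capped at the phone's 2.0 A?
```

Essentially my question is which of these two functions better describes what actually happens at the battery.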