Hi. If I am charging an SLA battery from a constant-voltage source with a high-wattage resistor in series to limit the current, how do I calculate the correct resistor value, assuming the battery voltage will not drop below 11.8 V?
Somebody told me a while back that it's not R = E/I, because the battery is never charged from zero volts; it starts at more like 11.5 V or above. I'd like someone to confirm this so the charge doesn't take longer than it needs to.
Let's say I want to charge my 7.2 Ah battery at 14.5 V with a maximum of 1 A (the battery label says 14.4-15.0 V, initial current < 2.1 A). Using R = E/I gives 14.5 V / 1 A = 14.5 ohms (closest standard value 15 ohms). Is this correct, or should I be using a slightly different formula?
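To illustrate what I think the correction means (a quick sketch, using my assumed 11.8 V minimum from above): the resistor only sees the difference between the supply and the battery terminal voltage, not the full supply voltage, so the worst-case (highest) current occurs at the lowest battery voltage.

```python
# Sketch of the series-resistor calculation as I understand it.
# Assumed figures: 14.5 V supply, 11.8 V minimum battery voltage, 1 A limit.

V_SUPPLY = 14.5    # constant-voltage source, volts
V_BAT_MIN = 11.8   # lowest expected battery terminal voltage, volts
I_MAX = 1.0        # desired maximum charging current, amps

# The resistor drops only the difference between supply and battery:
r = (V_SUPPLY - V_BAT_MIN) / I_MAX   # ohms

# Worst-case power dissipated in the resistor (at the lowest battery voltage):
p = I_MAX ** 2 * r                   # watts

print(f"R = {r:.2f} ohm, worst-case P = {p:.2f} W")
```

If that reasoning is right, the answer is about 2.7 ohms rated for roughly 3 W, not 15 ohms, and the current tapers off as the battery voltage rises toward 14.5 V. Is that correct?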
The charger will switch to float voltage (around 13.6 V) automatically.
One more question: if the battery says "initial current < 2.1 A", why do they call it "initial current" rather than "maximum charging current"? Thanks.