Internal resistance of a charging battery


Rusttree

Member
I'm trying to get an accurate calculation of the initial internal resistance of a charging battery. I'm working on a project that involves a DC power supply charging a 24V battery. The power supply is limited to 50A.

I fully drained the battery and measured the open terminal voltage at 13V. I then connected the charger, which immediately pegged at 50A. The charger drove the voltage to 22.5V. The simple calculations say:
R = (22.5-13)/50 = 190mOhm

Is that the whole picture? Or is there a resistance associated with the battery charger that I need to take into account as well? If so, is there a trick to expose that value?

I need to get an exact value because the design calls for two 24V batteries in parallel, but I only have the resources to test one right now. If I have a good estimate of the internal resistance when the battery is fully drained, I can predict the circuit behavior when the 2nd battery is added. Thanks!
 
You don't have to worry about the resistance of the power supply in that situation, because it is in current limit. I assume you are sure that the current really is 50 A while the output sits at 22.5 V.

Battery resistance is quite complicated. It also varies with state of charge, current, temperature and battery condition. I suspect that the charging resistance is more than the discharge resistance. Also, you should maybe look at how the voltage varies with charging current. You will probably find that the dead battery will jump to around 20 V with just 10 A or so, and the voltage would be only a bit higher than 22.5 V if you supplied 100 A to it.

I think that to get consistent readings you should work from a starting point near 24 V.
 
Hi,

In order to really measure the equivalent series resistance you have to use a pulse measurement system. It's similar to measuring the ESR of a capacitor. It's not quite that simple, but doing it that way would give you an idea of what the resistance really is, if you really need to know it.

The reason for using a pulse is that as the voltage is applied the state of charge changes, and we won't know what the 'internal' (past the ESR) voltage really is, so there's no way to apply Ohm's Law directly. With a pulse we can make the measurement quickly, before the state changes too much, and that makes it possible to use Ohm's Law. Of course this requires a microcontroller of some sort, or some other fast measuring system that can be synced to the pulse timing.

The internal resistance is really more complicated than a single resistor, but this will give some idea of what is going on and might provide comparative data that can be used later against other batteries of the same type, or to track the battery as it ages.
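Just to make the pulse idea a little more concrete, here is a rough sketch of the arithmetic (the readings below are made up for illustration, not from any real battery): capture the terminal voltage and current just before and just during a short current step, and take the ratio of the changes.

```python
# Rough sketch of the pulse ESR estimate described above, not a full design.
# Assumes you can command a short current pulse and sample the terminal
# voltage fast enough that the state of charge barely changes during it.

def estimate_esr(v_before, v_during, i_before, i_during):
    """Approximate ESR as delta-V / delta-I across a short current step."""
    dv = v_during - v_before
    di = i_during - i_before
    if di == 0:
        raise ValueError("current step must be nonzero")
    return dv / di

# Made-up example readings: battery rests at 24.6 V and sits at 25.6 V
# while a 5 A charge pulse is applied.
print(estimate_esr(24.6, 25.6, 0.0, 5.0))  # -> 0.2 ohm
```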
 
Thanks for the responses. I suspected it wasn't as simple as Ohm's Law using the simple measurements I had taken.

Here's the whole motivation for my post. We have a generator connected to a battery charger (it's really an AC/DC rectifier with some battery smarts in it). Downstream of the charger are two 24V batteries in parallel and a load in parallel. See the diagram attached. There is a concern about what would happen if the system is turned on when the batteries are fully drained. Since the load is in parallel with the charging batteries, and the charger is limited to 50A, the load will be subjected to the lower DC voltage. If the depleted batteries drive the charger's voltage down too low when the generator is turned on, the load (which is sensitive electronic equipment) may experience undervoltage conditions until the batteries recover.

I was able to test the circuit in the attached diagram, but only with one battery. With just that one, the initial charge voltage dropped to 22.5V. Since the charger is limited to 50A, adding a 2nd battery would drive the voltage down even more. Ideally, I'd like to predict how low the initial DC voltage will be across the load when both batteries are in the circuit. Any further ideas are greatly appreciated.
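For what it's worth, here is how I'm thinking of framing the prediction on paper (a simplified sketch only: each depleted battery modeled as a fixed open-circuit voltage behind a single series resistance, which I realize glosses over everything said above about how that resistance really behaves). While the charger is pegged at its limit it acts like a 50A current source, so the common bus voltage V has to satisfy I_limit = V/R_load + N*(V - V_oc)/R_batt. Solving for V:

```python
# Sketch only: charger in current limit modeled as an ideal current source,
# each depleted battery as an open-circuit voltage v_oc behind a series
# resistance r_batt, and the load as a fixed resistance r_load.

def bus_voltage(i_limit, r_load, v_oc, r_batt, n_batteries):
    """Solve i_limit = V/r_load + n*(V - v_oc)/r_batt for the bus voltage V."""
    # Rearranged: V * (1/r_load + n/r_batt) = i_limit + n * v_oc / r_batt
    return (i_limit + n_batteries * v_oc / r_batt) / (1.0 / r_load + n_batteries / r_batt)

# Illustrative made-up numbers (not my measured values):
print(bus_voltage(i_limit=50, r_load=1.0, v_oc=15.0, r_batt=0.2, n_batteries=2))  # ~18.2 V
```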
 
The voltage across the load will be exactly the same as the voltage across the batteries, i.e. 13V when the batteries are fully depleted, if 13V is your criterion for deciding depletion.
 
Right, I understand that.

Because the voltage across the load will be the same as the voltage across the batteries, I am concerned with how low the voltage will drop when both batteries are in the circuit and the 50A charger is turned on.
 
Depending on the chemistry of the battery: if it is lead-acid, then discharging a 24V battery to 13V is a good way to kill it quite quickly.

How will turning on the charger cause the battery voltage to drop?
The charger should raise the battery voltage.

JimB
 
I think my description is confusing. Sorry about that. Let me describe it this way:

Under normal circumstances, the two 24V batteries are kept at a state of full charge and the AC/DC converter happily runs the system at about 28V. The batteries pull almost nothing and the load pulls about 25A or so. Well under the 50A limit.

The situation I'm concerned with is when the generator quits, which turns off the AC/DC converter. The batteries support the load for several hours, but eventually they go dead. The generator is then turned back on. The DC voltage will be driven down well below the typical 28V in order to maintain the 50A limit. That's the voltage I'm trying to determine.
 
If I understand correctly:

The PSU/charger can supply up to 50A.
The normal load is 25A, so 25A is available to charge the batteries.

But when the batteries are discharged, they demand a charge current of 30A while the load draws 25A. The PSU cannot supply 55A, so its output voltage will drop.

If this is the case, I think that you should be looking at a different configuration, either a current limiter on the battery charge circuit, or a separate charger from the main 50A PSU.

JimB
 
Agree completely. And that's actually the end goal of this investigation - to recommend a power system redesign.

In order to get to that point, though, I'd like to be able to present some calculations that show "how bad" the situation is. We've already shipped out a couple units with the current design. Apparently, the original engineers never considered what would happen if the generator conked out. So the powers that be need a lot of convincing to understand what needs to be done. I figure the more details I can show, the better.
 
Well, I did some PSpice-ing and came up with a number. I went back to my test setup and got some better values (the values in my original post were recalled from memory.... a very poor memory). The battery was actually at about 15.3V at the point we considered it "depleted". When power was turned on, the DC output was 21.5V. I figured out the equivalent resistance of the load at 0.88Ohm. I set up a circuit with a single battery, a 21.5V source, and the 0.88Ohm load. I tweaked the internal battery resistance until the currents looked pretty close to what I was measuring. The internal resistance came out to be 0.24Ohm.

I then added a second battery with exactly the same properties. This time I adjusted the DC voltage of the supply until the supply current was 50A. The end result was 18.85V. The attached screen capture is the final circuit that I think should approximately predict the numbers I was looking for.
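As a sanity check on the PSpice result, plugging the same values into the simple node equation I sketched earlier (charger as a 50A current source, each battery as 15.3V behind a fixed series resistance) lands within about 0.1V of the simulated numbers:

```python
# Same simplified node equation as the earlier sketch, with the measured values:
# V * (1/r_load + n/r_batt) = i_limit + n * v_oc / r_batt
i_limit, r_load, v_oc, r_batt = 50.0, 0.88, 15.3, 0.24

for n in (1, 2):
    v = (i_limit + n * v_oc / r_batt) / (1.0 / r_load + n / r_batt)
    print(n, "battery/batteries:", round(v, 2), "V")
# 1 battery   -> ~21.45 V (measured 21.5 V)
# 2 batteries -> ~18.74 V (PSpice gave 18.85 V)
```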
 
OK, I understand your problem!

I think you are approaching the problem from the wrong side.
I think you need to look at the load characteristics of the PSU.
Load it up to 50A and beyond and measure the voltage.
Only then will you be able to determine the voltage available; it is the PSU which is losing output volts, not the battery.

JimB
 
If a 24 volt SLA battery is at 13.5 volts it's beyond "depleted". About 21 volts should be your low voltage limit for discharge for normal lead-acid cells.
 
JimB,
I'm not entirely sure I follow you. When I turned on power to my test setup (one battery and the load in parallel), the DC PSU immediately pegged at 50A. So it was loaded as much as it could go. In order to keep the current at 50A, it has to lower its output voltage to abide by Ohm's Law. And if I put a 2nd depleted battery in the circuit, it'll sink even more current, which'll cause the PSU to drop its voltage even more. So I'm following you there, but I'm not sure what you're suggesting.

nsaspook,
Well, it was actually 15.3V, not 13.5 (see post #11 above) - although that doesn't make it much better. I'm intentionally creating a worst-case scenario. We were able to get the battery down to 15.3V in our test setup, so we have to assume the customer will also be able to run the batteries down that low.
 
Sorry, mistyped the voltage.
Discharging it much below 20 volts with a load even once can damage the battery, causing cell reversal and an internal electrical short. You need a low-voltage disconnect long before 16 volts.
 
Yeah, that's actually going to be my primary recommendation: stop the batteries from ever getting so low in the first place. But that's for the powers-that-be to decide whether they want to make the hardware change.
 
At least put in an annoying (non-silenceable :p) warning signal.
 