Battery charging time and charging current?


Flyback
Hello,

I have a battery of NiCd's which is 6V nominal.

I wish it to supply 0.714 Amps for 3 hours.
-this is 3*60*60*0.714 Coulombs ≈ 7711 Coulombs.

If I am to charge this battery up for this service, and I have a 19 hour charge time,
would I be right in saying that I need a constant charger current of 112mA?
 
With NiCads, the charging efficiency is very poor: you must put in 140% to get back 100%. The other issue is charge termination; there is no easy way. The terminal voltage of a NiCad says little about its state of charge. Fast chargers use temperature rise, but by the time the battery gets hot, you have already begun to damage it. Medium-rate chargers have a charge rate which is less damaging, but must be turned off based on time. Low trickle-charge rates supposedly can be tolerated indefinitely, but in my experience they eventually dry out the electrolyte, too.
 
Most battery charger ICs detect the voltage "hiccup" produced when a Ni-Cad or Ni-MH battery is fully charged. Then they turn off.
Here is what the voltage vs amount of charge looks like:
 

[Attachment: Ni-Cad and Ni-MH voltage.PNG (voltage vs. amount of charge for Ni-Cad and Ni-MH cells)]
"If I am to charge this battery up for this service, and I have a 19 hour charge time, would I be right in saying that I need a constant charger current of 112mA?"

Hello there,


That works if your battery or battery pack is around 1.5 ampere hours. But if it is 2.1 ampere hours then it doesn't work, as MikeMI said. So if you haven't already figured in the charge acceptance of a NiCd, then you have to do that first.

It's very simple, but the way to start is with the ampere hour rating of the battery or battery pack. You take the ampere hour rating, multiply by 1.4, then divide by the current. So a 1.5 ampere hour battery being charged with a 100mA power supply would be calculated as:

x=1.5*1.4=2.1 ampere hours (including the charge acceptance for the battery)
t=x/i=2.1/0.1=21 hours

So you'd have to charge for 21 hours with 100mA.

Notice that in the above we start with the Ampere Hour rating of the battery and use that as the basis for calculating the charge time with any current source.
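
Just to make that concrete, here is a minimal Python sketch of the same arithmetic (the function name is illustrative; the 1.4 factor is the NiCd charge acceptance discussed above):

[CODE]
def nicd_charge_time(capacity_ah, charge_current_a, acceptance=1.4):
    """Estimate constant-current charge time for a NiCd pack.

    capacity_ah      -- Ampere Hour rating of the battery or pack
    charge_current_a -- constant charger current, in Amperes
    acceptance       -- charge acceptance factor (about 1.4 for NiCd)
    """
    charge_needed_ah = capacity_ah * acceptance    # x = 1.4 * Ahr rating
    return charge_needed_ah / charge_current_a     # t = x / i

# The worked example above: 1.5 Ampere Hour pack, 100mA supply
print(nicd_charge_time(1.5, 0.100))   # -> 21.0 hours
[/CODE]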

Also, you mention that you want to use a "constant charger current" of 112mA. That's fine, but you don't really need a constant current. The current can vary over time, so the circuit can be simpler than an actual constant current circuit would be. The only difference is that instead of the simple division by the constant current, we have to sum all the shorter charging intervals with a current that changes gradually from some min to some max. The reason we go through all this is so that we don't have to actually build a constant current charger, which is much more complicated than a simple DC wall wart.

The new calculation is similar to the way we did it before, with a slight modification...

First, x is still equal to the ampere hour rating of the battery times the charge acceptance of a NiCd:
x=1.5*1.4=2.1 ampere hours.
That doesn't change. But because the current now starts out higher and ends up lower, we have to sum up all of the incremental ampere hours that change over time. Luckily, doing this results in a very simple formula if the ampere hour increments change linearly over time. Unfortunately, they don't, but lucky for us again they don't change by that much in most cases, so we can still use a simpler formula to compute the time to charge:
t=2.1/Iavg
where
Iavg is the average current during the total charging period. Iavg is easy to calculate too:
Iavg=(Imax+Imin)/2

So what we end up with is:
t=2.1*(2/(Imax+Imin))

A good question at this point is: how do we determine Imax and Imin? Luckily again, this isn't hard to do either. We have to charge one time and measure the current when we first start to charge, then the current once the batteries are surely charged (after a long enough time to ensure a full charge). So the starting current may be 150mA and the ending current 50mA, which makes the average current 100mA over time. Using these two in the formula we have:
t=2.1*(2/(0.150+0.050))=2.1*2/0.200=2.1*10=21 hours.
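
Extending the same sketch to the varying-current case (this assumes the current tapers roughly linearly between the two measured values, as described above):

[CODE]
def nicd_charge_time_tapered(capacity_ah, i_max_a, i_min_a, acceptance=1.4):
    """Charge time estimate for a simple non-constant-current charger.

    Assumes the charge current falls roughly linearly from i_max_a at
    the start of charge to i_min_a at the end, so that the average
    current is Iavg = (Imax + Imin) / 2.
    """
    i_avg = (i_max_a + i_min_a) / 2.0
    return capacity_ah * acceptance / i_avg        # t = 1.4*C / Iavg

# Measured 150mA at the start of charge and 50mA near full charge:
print(nicd_charge_time_tapered(1.5, 0.150, 0.050))   # -> 21.0 hours
[/CODE]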

So you don't need a constant current charger, if you don't mind doing just a little more work for the calculation.
 
Thanks MrAl, do you have an application note or document which backs up what you are saying? It's just so that I can show the boss.
 
If you do not limit the charging current then the "wall wart" must be big enough that it doesn't catch on fire.
What happens if the battery is completely dead or has a shorted cell or more? LOTS of current!
A current limiting resistor or circuit is less expensive than a huge wall wart.
 
Thanks MikeMI, I just read that, but it didn't really state the relationship between the charge input needed and the actual mAh rating of the battery....

it only gave this very hand-waving description....

The coulometric charging efficiency of nickel cadmium is about 83% for a fast (C/1 to C/0.24) charge, and 63% for a C/5 charge. This means that at C/1 you must put in 120 amp hours for every 100 amp hours you get out. The slower you charge the worse this gets. At C/10 it is 55%, at C/20 it can get less than 50%. (These numbers are just to give you an idea; battery manufacturers differ.)

.....Please tell me if you know of more exact info?
 
This is the year 2013.
Why don't you use a battery charger IC that limits the charging current then shuts off when it detects that the battery is fully charged?
 
Why don't you use a battery charger IC that limits the charging current then shuts off when it detects that the battery is fully charged?

....because our charging current comes from the secondary of a resonant inductive coupler, which gives a fixed 260mA current (this level can be changed at the production stage).
The coupler current can be switched on and off, but it cannot be increased.....so we are stuck with this fixed charging current.

Battery full-charge detection relies on negative dV (NDV) detection, and with our 4Ah battery we are charging at too low a rate for NDV to show up.

There are temperature methods of detecting end of charge, but again, with our low charge rate (<C/10), the battery will not heat up at all.
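
For reference, NDV termination amounts to something like the Python sketch below (the 10mV threshold and the fixed sampling interval are illustrative assumptions, not our design values); at our <C/10 rate the post-peak voltage dip is too shallow to trigger it reliably:

[CODE]
# Illustrative NDV (-dV) end-of-charge check. The threshold value and
# fixed sampling interval are assumptions, not figures from a datasheet.
# At charge rates below about C/10 the post-peak voltage dip is too
# small for this to trigger reliably, which is exactly our problem.
def ndv_terminated(voltage_samples_mv, ndv_threshold_mv=10):
    """True once pack voltage falls ndv_threshold_mv below its peak."""
    peak_mv = 0.0
    for v in voltage_samples_mv:
        peak_mv = max(peak_mv, v)
        if peak_mv - v >= ndv_threshold_mv:
            return True   # -dV detected: terminate the charge
    return False
[/CODE]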
 
... it didn't really state the relationship between the charge input needed and the actual mAh rating of the battery....

it only gave this very hand-waving description....

.....Please tell me if you know of more exact info?

You will have to get data specific to a selected battery from the Manufacturer of the battery.
 
Thanks MrAl, do you have an application note or document which backs up what you are saying? It's just so that I can show the boss.

Hello again,

You probably won't find a data sheet with this kind of information on it, because it is just too basic to require much to be written about it. The battery is just a device that takes charge from a source and stores it, and the basic theory says that the charge in Ampere Hours stored is just the current in Amps times the time in Hours. But what if we don't charge it all at once, but do it half now and half later? Say we have a 10 Ahr battery and we supply 1 amp for 5 hours; that's only half the storage capacity, but then later in the day we supply another 1 amp for 5 hours (not including the charge acceptance factor, for simplicity). Now we've got the full 10 Ahr in the battery even though we didn't do it all at once.
But then what if we break it up into 1 amp for 1 hour, and do this only one time per day for 10 days? That's still 10 ampere hours. And we don't have to use the same current either. We can do 1 amp for 1 hour, 2 amps for 1 hour, then 1 amp for 2 hours, then 1 amp for 5 hours. That's still 10 ampere hours.
So the total charge into the battery is the sum of charges over time, assuming there is no drain on the battery between or during the charge periods.

So we sum up the charge over time, and it doesn't matter what the current is, as long as it is above some minimum level that exceeds the self discharge of the battery.

[LATEX]Q=\sum_{k=1}^{N} i(k)\,t(k)[/LATEX]

That's the approximation, and it is accurate if the current is roughly constant over each increment, i.e. if all the t(k) are small.

In the limit of infinitely small increments the approximation becomes exact, but in practice we often break the time up into discrete increments. So, just for reference, in the limit we get:
[LATEX]Q=\int_{t_{0}}^{t_{max}} i(t) dt[/LATEX]

Keep in mind that those equations give the charge after a given time, but they assume that there is no previous charge in the battery at the start of the charge period. If there is a charge to start with, that gets added to whichever equation is being used.

The unit for battery charge is usually the Ampere Hour, so that means all currents are expressed in Amperes and all times in Hours.
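
As a minimal numeric version of the summation above (assuming current samples logged at a fixed interval; the logging interval is illustrative):

[CODE]
# Numeric version of Q = sum of i(k)*t(k): accumulate Ampere Hours from
# current samples taken at a fixed interval, assuming no drain on the
# battery between or during the charge periods (as stated above).
def charge_added_ah(current_samples_a, interval_hours):
    return sum(i * interval_hours for i in current_samples_a)

# The example above, logged once per hour: 1A for 1h, 2A for 1h,
# 1A for 2h, then 1A for 5h -> 10 Ampere Hours total
samples = [1.0, 2.0, 1.0, 1.0] + [1.0] * 5
print(charge_added_ah(samples, 1.0))   # -> 10.0
[/CODE]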


From your more recent posts, it sounds like you are using a constant current source to start with, so you don't need to worry about this too much. If you have to charge a 4 Ampere Hour battery, then multiply that by 1.4 and divide by the charge current and you have the time needed.
 
Thanks MrAl
"If you have to charge a 4 Ampere Hour battery, then multiply that by 1.4 and divide by the charge current and you have the time needed."

It would be nice if there was a formal proof paper of this so I could show the boss.

Would it be OK to trickle charge a 4Ah, 5-series-cell NiCd battery at C/10 continuously, and just ensure I never go above 1.6V/cell?

...or would it overheat?

..would I have to use battery temperature monitoring to detect end of charge, and to stop the battery overheating?

I appreciate that negative dV detection is a waste of time with C/10 charging, because at C/10 the dV will be too small?
 


Hello again,


What do you call a formal proof? This concept is so basic to battery charging that I don't know if anyone ever bothered to figure it out formally in the past 20 years, but there is a way to prove it yourself if you are interested. More about that in a minute.

Charging a 4Ahr cell at C/10, I presume you mean 400mA indefinitely. That would work, but it would reduce the life of the cells. It's often quoted that NiCd's can take the constant overcharge (as that is what that is doing), but when it goes on for very long periods it does in fact reduce the life. NiCd's might last 5 years or more if charged correctly, but with a constant overcharge like that they may only last a year or so.
I never recommend overcharging except for the short term. That's a couple of hours or so, at a low enough current to not overheat the cells. The temperature may rise a bit, but as long as this condition doesn't last too long it should be OK. NiMH cells have a little more difficulty with this, but supposedly the newer designs can handle overcharge a little better than the older designs did.

Yes, you are correct in assuming that C/10 charging is probably too small to reliably produce the minus dV/dt end-of-charge indication.
You can use battery temperature monitoring if you like, but you do have to ensure that the temperature probe always stays in contact with the surface of the cell, otherwise you're monitoring the room temperature :) That's one of the drawbacks to thermal measurements like this. It's also good to make sure that the circuit's response to a thermal sensor error looks like overheating rather than like a nice cool cell temperature.
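
Something like the following sketch captures that fail-safe idea (all the threshold values here are illustrative assumptions, not recommendations):

[CODE]
# Fail-safe temperature termination: a detached or broken sensor often
# reads as an out-of-range value, and that should stop the charge just
# as a genuine over-temperature reading would. Thresholds are only
# illustrative assumptions.
def keep_charging(cell_temp_c, max_temp_c=45.0,
                  sensor_min_c=-10.0, sensor_max_c=85.0):
    if not (sensor_min_c <= cell_temp_c <= sensor_max_c):
        return False   # implausible reading: assume sensor fault, stop
    return cell_temp_c < max_temp_c
[/CODE]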

If you drain the cells all the way down before using them, you can then assume that they need the full length of time to charge. But if they are not drained down, then you can't really charge for as long as when they are drained down all the way. That's because the cell will go into overcharge sooner.

About the cell charging tests...
If you wanted to, you could charge the cells up fully, then drain them down under controlled conditions and calculate the Ampere Hour rating. You could then experiment with different charge times to get an idea of what the more exact acceptance factor should be.
If you intend to use temperature monitoring, however, then you don't need to do this.
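
The arithmetic for that test is just the charge relation run in reverse; a sketch (the discharge current, time, and cutoff voltage below are illustrative, not measured values):

[CODE]
# Capacity from a controlled constant-current discharge test:
# Ahr rating ~ discharge current * hours to reach the cutoff voltage.
# A cutoff around 1.0V/cell is a commonly quoted figure for NiCd,
# not something measured in this thread.
def capacity_ah(discharge_current_a, hours_to_cutoff):
    return discharge_current_a * hours_to_cutoff

# e.g. a pack that held 0.5A for 7.6 hours before cutoff -> 3.8 Ahr
print(capacity_ah(0.5, 7.6))   # -> 3.8
[/CODE]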

There are various other websites which show the same thing about charging times and acceptance factors. I think Battery Space might be one of them. You could browse around the web for others too; there must be many out there.
 
Seems like you need to learn how to use Google...

**broken link removed** is another, slightly more verbose, if not more rigorous.
 
our charging current comes from the secondary of a resonant inductive coupler, which gives a fixed 260mA current

But earlier you said you wished to charge it with over 700mA. :confused:

It kind of sounds to me like someone already decided to charge it at 0.06C and just let it trickle charge. Probably the only choice. :rolleyes:
 
MikeMI,
I wish it were on Google.........."what is the rate that's acceptable for continuous trickle charging of NiCd cells, and can be done without battery temperature monitoring" is not on the web anywhere.

As you will know, it's no surprise: this will be the industrial secret of those who are making money from it, as we expect.

Sorry ronv, I don't remember the 700mA.....but yes, the resonant inductive couplers can be manufactured to produce a variety of constant current levels....and we are trying to decide which one to use for our NiCd charger.

The output current of an R.I.C. is a rectified 50kHz sine wave...so 100kHz after rectification...so it's constant, but pulsating, as such.

The trouble with 0.06C is that it won't get us fully charged within 19 hours, which is the standard for emergency lighting.
 
Hello again Flyback,

You could be right there, because for many years a common household product came with instructions to leave the charger plugged in 24 hours a day. That of course kept the wall-wart-based charger charging the NiCd's continuously, which resulted in overcharging. The batteries lasted a year and then stopped holding a charge, for the most part. They also liked to state that continuous charging of a NiCd is not harmful to the battery. But dead batteries don't lie. Or should I say, batteries that can't hold much of a charge any longer don't lie.

The more acceptable recommendation for 2000mAhr NiMH cells is 250mA, but again I would not keep that up too long. NiCd's are supposed to be more rugged against overcharging, but one thing we know for sure: no overcharging is better than any overcharging.

If you monitor the current out of the battery over time, you can compute the ampere hour draw, then multiply by 1.4 and charge accordingly. That's one way to do it. This would take microcontroller processing power, which these days is pretty cheap.

Just to note: if you have a pulsating charge current, then you have to take the pulse shape into consideration to establish an average charge current. If you already know the average charge current, then you're one step ahead.
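
A sketch of both ideas together: an Ampere Hour counter built from current samples, and the rectified-sine average for the pulsating charge current mentioned earlier (the sample period, and the assumption that 260mA might be a peak rather than an average, are illustrative):

[CODE]
import math

# Running Ampere Hour counter for the microcontroller approach above:
# integrate the measured battery current over time, then charge back
# 1.4x the Ampere Hours drawn. The sample period is an assumption.
def ah_drawn(current_samples_a, sample_period_s):
    return sum(current_samples_a) * sample_period_s / 3600.0

# For a full-wave rectified sine, the average current is 2/pi times
# the peak. So if the coupler's 260mA figure were the peak rather
# than the average, the average charge current would be:
i_avg = 0.260 * 2 / math.pi
print(round(i_avg, 3))   # -> 0.166 A
[/CODE]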

If you have the ability to attach temperature monitoring sensors, then by all means do it. It's hard to go wrong that way, except of course for the mechanical aspects, where you need to make sure the sensor stays in contact with the cell(s).

NiCds are being phased out of most products these days anyway, in favor of either NiMH or Li-ion.
 
Don't forget that a Ni-Cad AA cell had a capacity of only 600mAh, but a Ni-MH AA cell is 2500mAh, which is more than 4 times as much.
Then a charge of C/10 for the Ni-Cad is a current of only 60mA, and 60mA for a 2500mAh Ni-MH cell is only about C/42.

Energizer and a few Japanese Ni-MH cell manufacturers recommend a trickle charge of C/40. Then the cells get only barely warm.
 