battery charging oscillation loop

Status
Not open for further replies.
I have a simple 12 volt battery backup device. As long as power is being supplied via an AC/DC adapter, the relay keeps the backup from working.

The circuit is being powered by a 3000 mAh Ni-Cd battery pack. The sensing part works great and indicates when the battery voltage drops to about 10 VDC (I can change this). What I would like to do is have the sensing circuit switch to charging the batteries. Here is what I understand: if the sensing circuit switches power from the adapter to charge the batteries, it will now "see" a full 12 volts and stop charging, then re-sense that the batteries are low and turn the charging back on. An oscillation loop. How do you prevent this?
 
Look up hysteresis, which any such system should have. However, the battery shouldn't instantly go to 12 V when you start charging it; it will rise gradually (which is an indication of its charge level).

You also need to be careful how you charge the batteries, for both safety reasons and to prolong battery life.
 
First, I have not done anything yet with regard to charging. Next, what is hysteresis? Nothing on the web explains this easily to me. I know that the batteries need to charge SLOWLY. The adapter that I was going to use had an output of 1 amp. I read that the batteries needed to be charged at a maximum of 10% of the battery's mAh capacity. I don't really care how long the charging takes. The adapter I have now is only 50 mA - just enough to pull the relay contacts in. I do realize that this could take days to charge the battery - I don't care. Does anyone else have anything that will help me?
 
Thank you so very much for your research. This is not sarcastic - I know that this took you time. This explanation was spot on! I understood the concept perfectly, as I have experienced this personally. Now, how does this help me with preventing an oscillation loop when I get this battery pack to charge? As far as I understand, slowly ramping the voltage up to the charging state will STILL cause this to happen. It will happen because the batteries will not charge correctly until the proper voltage is applied across them, and when that happens the sensing circuit will cease charging. Now that charging has stopped, the same circuit will sense the low battery level and charging will again start by ramping up (as before). I'm sorry, but this doesn't help.
 
I think that you need a better understanding of the charge cycle of NiMH cells.

The nominal voltage of a NiMH cell is usually stated at 1.2 volts, the cell is considered to be fully discharged when the voltage is down to 1 volt.

However, during charging, the cell voltage will rise to over 1.5 volts.
The charge curve of a NiMH cell is a bit odd in that, at the end of the charge period, the cell voltage rises and then falls, creating a hump in the curve.
One way to tell when a NiMH cell is fully charged is to detect the peak of the hump and then either turn off the charge current, or, switch to a low trickle charge current.

Your idea of charging a NiMH battery over several days with a small current is, in my experience, a bad idea.
NiMH cells need to charge at the C/10 rate, or maybe a bit faster if carefully monitored.
Cells charged slowly often do not charge well.
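That peak-of-the-hump detection (often called "-ΔV" termination) can be sketched in a few lines of control logic. This is only an illustration of the idea - the sample voltages and the 10 mV drop threshold are assumed values, not figures from this thread:

```python
def negative_delta_v(readings, drop_mv=10):
    """Return True once the latest cell voltage has fallen at least
    `drop_mv` millivolts below the highest voltage seen so far -
    the classic -dV end-of-charge signal for NiMH/NiCd cells."""
    if not readings:
        return False
    peak = max(readings)
    return (peak - readings[-1]) * 1000 >= drop_mv

# Simulated per-cell voltages rising to the hump, then falling back:
samples = [1.38, 1.44, 1.50, 1.52, 1.51, 1.50]
print(negative_delta_v(samples))  # 20 mV below the 1.52 V peak -> True
```

A real charger would sample the voltage periodically and stop (or drop to trickle) as soon as this returns True.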

I have written about this hump in the charge curve before, look here:

JimB
 
OK, I have four 12-volt battery packs. Where are you getting 1.5 volts from? I told you that I read that these packs needed to charge at that rate (C/10). I don't have the capability to do this. Don't bother trying to help anymore - you just burst my bubble. I'm ******* sorry I asked.
 
I'm not done. Holy ****! You'd think that I was trying to launch a rocket ship to the far side of MARS! Then, my original question was NEVER answered! I read about charging these batteries BEFORE I designed this circuit. To charge them in the way I'm being told, I would have to put a 17 ohm resistor in parallel with the battery to limit the charging current to 300 mA. Now this is going to waste 700 mA and create almost 12 watts of un-needed heat. What in the HELL am I going to do with that? If you guys want to delete my account here - ******* go ahead.
 
If you have no hysteresis and you set the end-of-charge detector to 14.000 volts (for example), then when the battery reaches 14.001 volts the charger will be switched off. The battery voltage will start to drop. It will quickly fall below 14.000 volts, so the charger will switch back on. This sequence will repeat until something is done to stop it. This is oscillation. It would be much better if the charger did not switch back on until the battery voltage dropped to, say, 11.000 volts. This behavior can be achieved by adding hysteresis to the voltage detection comparator.
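The difference hysteresis makes can be shown with a short simulation. The 14.000 V / 11.000 V thresholds come from the example above; the rest is an assumed sketch:

```python
def charger_on(voltage, was_on, v_off=14.0, v_on=11.0):
    """Comparator with hysteresis: switch off above v_off, but do not
    switch back on until the voltage has fallen all the way to v_on."""
    if was_on and voltage >= v_off:
        return False
    if not was_on and voltage <= v_on:
        return True
    return was_on  # between the two thresholds: hold the current state

on = True
for v in [13.9, 14.001, 13.999, 12.5, 11.0, 12.3]:
    on = charger_on(v, on)
    print(f"{v:6.3f} V -> charger {'on' if on else 'off'}")
```

Note that at 13.999 V the charger stays off instead of chattering back on; if v_on were equal to v_off, the output would oscillate around 14.000 V exactly as described.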
It did not take long to find a good description of hysteresis in the electrical context. Just searching for "hysteresis" gave descriptions that were too general and would have confused you. Searching for "electrical hysteresis" gave the link I posted, from the first page of results.

Les.
 
Where are you getting 1.5 volts from?
That is the voltage of the individual cells.
A battery is made from a number of cells; a 12 V NiMH battery will usually have 10 cells. During charging the battery voltage can rise to about 15 volts.

Don't bother trying to help anymore - you just burst my bubble. I'm ******* sorry I asked.
Similarly, I am sorry that I tried to help.

JimB
 
Then, re-see that the batteries are low and turn the charging back on. An oscillation loop. How do you prevent this?
I'd say you need two voltage comparators plus a timer, none of which is all that complex or expensive.

One comparator that triggers at the low voltage discharge level and switches the charge system on.
And the second switching at the full charge voltage and enabling the timer.

Once the timer expires, it turns off the charge.

That allows full charge then some trickle charge time, which is pretty essential with NiCd or NiMH batteries, to equalise the cells and maintain them in good condition.
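That two-comparators-plus-timer arrangement is really a tiny state machine. Here is a sketch of the logic - the state names, thresholds, and tick-based timer are my own assumptions for illustration, not values from this thread:

```python
IDLE, CHARGING, TOP_OFF = "idle", "charging", "top_off"

V_LOW = 10.0       # low-voltage comparator threshold (assumed)
V_FULL = 14.0      # full-charge comparator threshold (assumed)
TOP_OFF_TICKS = 4  # trickle-charge timer length, in loop ticks (assumed)

def step(state, timer, voltage):
    """One control-loop tick; returns (state, timer).
    Charge current flows in CHARGING and TOP_OFF, not in IDLE."""
    if state == IDLE and voltage <= V_LOW:
        return CHARGING, 0               # comparator 1: start charging
    if state == CHARGING and voltage >= V_FULL:
        return TOP_OFF, TOP_OFF_TICKS    # comparator 2: start the timer
    if state == TOP_OFF:
        return (IDLE, 0) if timer <= 1 else (TOP_OFF, timer - 1)
    return state, timer

state, timer = IDLE, 0
for v in [10.0, 12.0, 14.1, 14.2, 14.2, 14.2, 14.2]:
    state, timer = step(state, timer, v)
    print(f"{v} V -> {state}")
```

The full-charge comparator does not cut the current; it only arms the timer, so the trickle/equalisation period always runs before the charge switches off.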


For the simplest practical and moderately fast NiMH "charger" setup, you need a DC power supply a volt or two higher than the full charge voltage of the battery.

You connect that to the battery via a rectifier diode and series [power] resistor.
The resistor value needs calculating so that, when the battery is at its full charge voltage, the current is just under C/20.

That means when the battery is at its minimum voltage, the charge current will be rather higher; the smaller the difference between the supply and the full-charge voltage, the greater the initial current increase.

eg. Using a 16V supply with a 15V full-charge battery voltage: approx. 0.6V diode drop leaves 0.4V across the resistor, and 0.4V at 150mA (C/20) gives about 2.7 ohms; I'd try three ohms.
At the minimum charge voltage of 10V, that means 5.4V across the resistor and an initial charge current of about 1.8A.
The resistor would need to be rated at 10W or higher.
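The arithmetic in that example is easy to check. The figures below just reproduce it (16 V supply, 0.6 V diode drop, 15 V full-charge and 10 V discharged pack voltage, C/20 = 150 mA for a 3000 mAh pack):

```python
V_SUPPLY, V_DIODE = 16.0, 0.6
V_FULL, V_MIN = 15.0, 10.0     # 10-cell NiMH pack: full-charge / discharged
I_TARGET = 3.0 / 20            # C/20 for a 3000 mAh pack, in amps

r_ideal = (V_SUPPLY - V_DIODE - V_FULL) / I_TARGET
print(f"ideal resistor: {r_ideal:.2f} ohm")   # ~2.67 ohm, so use 3 ohm

r = 3.0
i_init = (V_SUPPLY - V_DIODE - V_MIN) / r     # worst case: flat battery
p_init = i_init ** 2 * r                      # worst-case dissipation
print(f"initial current: {i_init:.2f} A, dissipation: {p_init:.1f} W")
```

The worst-case dissipation works out to about 9.7 W, hence the 10 W (or higher) resistor rating.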

The one thing you cannot do is directly use a DC supply with no current limiting, or one lower than the battery full charge voltage. The battery or power supply would be damaged, eventually if not immediately.
 
This was meant to be a simple circuit. I cannot use a transistor to switch the load, and the relay is NOT meant to be the switching source for charging - although it would have been nice. I am only a simple experimenter and do not have the knowledge you guys do. I got "pissed off" because everything you were trying to explain to me went right over my head. I didn't want this to turn out to be such an elaborate project. I'm not using 14,000 volts - just 12. The battery packs only have 3 cells, not 10. The descriptions for the packs say they are Ni-Cd. I know that I cannot over- or undercharge the batteries. I cannot deal with 10 W of heat - why do I have to waste that anyway? 1.8 amps to charge a 3000 mAh battery pack? - I don't think so, Tim! I have read (like Jim said) that C/10 is the proper calculation, not C/20.
I have STOPPED following this. None of it is making any sense. I'll buy a pre-built charger.
 
The battery packs only have 3 cells not 10.
BE CAREFUL

A three cell battery is either not 12V, or it is using lithium cells - which need incredibly critical charging, with cell balance control, or they may burst / catch fire!

For info, C/10 is the "14 Hour" charge rate for NiMH or NiCd cells, not the only current allowed.

They can be charged much faster IF they are not overcharged at all; eg. Energiser say C/2 is fine, C/1 should be used with caution - but anything using that current continuously must have intelligent charge termination.

Or they can be charged slower. C/20 is the maximum permitted "long" charge, eg a couple of days, and C/40 can be applied forever to NiMH cells.
Nickel based cells are not harmed by a permanent low current trickle, it can improve their condition.
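For a 3000 mAh pack like the one in this thread, those rates translate into concrete currents and rough charge times. The ~1.4 charge-input factor for nickel chemistries is a common rule of thumb, not a figure from this thread:

```python
CAPACITY_MAH = 3000
CHARGE_FACTOR = 1.4  # charge input needed per unit capacity (assumed typical)

for label, n in [("C/2", 2), ("C/10", 10), ("C/20", 20), ("C/40", 40)]:
    current_ma = CAPACITY_MAH / n
    hours = CAPACITY_MAH * CHARGE_FACTOR / current_ma
    print(f"{label}: {current_ma:4.0f} mA for about {hours:.0f} h")
```

C/10 comes out at 300 mA for about 14 hours, matching the "14 Hour" rate quoted above.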

(And none of that applies to lithium - you cannot overcharge them in the slightest without making them unstable).

If you want to independently check charging requirements for any type of battery / cell, have a look at the "Battery University" site:
 