Just a sanity check on transformers

Status
Not open for further replies.

BrianG

New Member
I want to apologize up front for the long post, but want to make sure all the details are provided.

I'm designing a true constant-current AA NiMH charger, and want to select a power supply that will be just right. I want a voltage high enough that the CC transistor doesn't saturate, yet low enough to reduce the power dissipation in the transistor.

I plan to build a unit that will charge two separate banks of 2 NiMH cells in series at 0.3A (0.7A total for both banks), so I will be using two cc transistors in their own circuits, one for each 2-cell bank. NiMHs can reach as high as 1.6v (3.2v for 2 cells) before they peak, so I figure I want right around 6vdc.

My first question: I plan on using a 12.6v c.t. 1A transformer with full wave rectification. This should net me 6.3v @ 2A right?

Next question: I am a little worried about the higher voltage this will produce. If this transformer provides 6.3v @ 2A, the smaller load I will be placing on it (~0.7A total) will cause the output to be higher than 6.3v. After rectification and at 2A, the output should be 7.92v (figuring rectifier 0.7v loss). But since I will only be drawing ~0.7A total, the output will be closer to 9v DC, if not higher.
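As a sanity check on that arithmetic, here's a quick sketch. The 1.414 factor and 0.7v diode drop are the figures from the post; note that this version subtracts the drop after converting RMS to peak, which lands slightly above the 7.92v quoted.

```python
# Rough no-load DC estimate for a rectified transformer secondary.
# Assumes a full-wave rectifier with a single ~0.7 V diode drop in the path.

def rectified_peak_dc(v_rms, diode_drop=0.7):
    """Peak DC after rectification, ignoring load, sag, and ripple."""
    return v_rms * 1.414 - diode_drop

print(round(rectified_peak_dc(6.3), 2))  # ~8.21 V unloaded
```

Under load the output sags back toward the RMS rating, which is why a lightly loaded supply runs high.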

I know I could regulate the supply and feed the rest of the circuit a true 6v, but I thought that splitting the thermal load across the two cc transistors would be preferable (only have to heatsink the two transistors). If I regulate it first, the cc transistor dissipation will be lower, true, but it will still need to be on a heatsink, as well as the regulator. Same thermal load, just split differently. What would you do?

Incidentally, the circuit I came up with works quite well. It charges at the specified 0.3A until cell temperature reaches ~100*F (via thermistor). At that point, the circuit cuts the charge and latches "off", so even when the cells cool, the circuit will not start charging again. You reset the latched circuit via a momentary N.C. switch. I'll provide the schematic once everything is worked out powersupply-wise if anyone cares. I'm using my regulated bench supply right now, but I want the end result to be portable, therefore small, and want it to have its own built-in supply.
 
My first question: I plan on using a 12.6v c.t. 1A transformer with full wave rectification. This should net me 6.3v @ 2A right?

The transformer will not change its maximum output current unless it has two secondary windings connected in parallel. With a center-tapped transformer you can't do that; the secondary windings must be separate!

Use a 10VA transformer with either a single 9V/1.1A secondary winding or two separate 9V/555mA windings (e.g. Block UI39/8 209).

Next question: I am a little worried about the higher voltage this will produce. If this transformer provides 6.3v @ 2A, the smaller load I will be placing on it (~0.7A total) will cause the output to be higher than 6.3v. After rectification and at 2A, the output should be 7.92v (figuring rectifier 0.7v loss). But since I will only be drawing ~0.7A total, the output will be closer to 9v DC, if not higher.

Use an LM317T adjustable voltage regulator and adjust it to the desired voltage. It regulates pretty precisely independent of the load (minimum 10mA).

I know I could regulate the supply and feed the rest of the circuit true 6v, but I thought that splitting the thermal load across the two cc transistors would be preferable (only have to heatsink the two transistors). If I regulate it first, the cc transistor disspation will be lower true, but will still need to be on a heatsink, as well as the regulator. Same thermal load, just split differently. What would you do?

Nothing, if you use the suggested regulator. It works alright up to 1.5A.

Boncuk
 
The transformer will not change maximum output current unless it has two secondary windings connected parallel. With a center tapped transformer you can't do it. The secondary windings must be separate!

Use a 10VA transformer with either a single secondary winding of 9V/1.1A (or 2X9V, 2X555mA, e.g. Block UI39/8 209)

Ok, thanks. I somehow got it stuck in my head that if a transformer was rated 12.6v @ 1A (12.6w), that wiring the secondary for full wave rectification would reduce the voltage by half, but increase the available current by the same amount, so 2A.

That actually works more in my favor since 0.7A load is pretty close to the 1A rating, so the transformer will be loaded more, and the voltage will be closer to the area I want. I might not even need regulation. And actually, if the voltage is close enough, I won't be able to use a regulator since dropout will become a concern.

Use an LM317T adjustable voltage regulator and adjust it to the deisred voltage. It regulates pretty precisely independent of the load (minimum 10mA).

Yeah, I already was playing around using a 7805 regulator and adding resistors to boost it to ~6.5v.

Nothing, if you use the suggested regulator. It works alright up to 1.5A.

Boncuk

I was more concerned with power dissipation in a small area. If the unregulated voltage was 9v, and I wanted 6v, the dissipation at 0.7A would be 2.1w. I will have the heatsink mounted outside the case, but again, the area is pretty small to make this project portable. I'm looking for something about the size of commercial 4XAA chargers, just with a better "algorithm", so proper heatsink space is at a premium.

Thanks!
 
Your 6.3V output will be capable of a little less than 2A because the I^2R losses are higher than you expect. Each half of the winding is only providing half wave, so the copper losses are about the same as if you had simply put a bridge on one of the 6.3V windings. (You do save a diode drop, though.) Power rectifiers often have a 1V drop unless you've seriously overrated them, so this helps your numbers.

But maybe you don't want to save a diode drop, since your voltage is too high anyway. The rectifier needs a capacitor to actually achieve the 1.414 multiplier, and at 700mA you need 6000 uF to get 1V ripple. I'm assuming your circuit wants fairly smooth DC. So, ((6.3 * 1.414) -2) = 6.5V with ripple down to 5.5v (average 6.0). 350mA each at 3.2V means your CC transistors will be (0.35 * (6.0-3.2)) = 0.98 watt (each). Toasty but if it's a TO-220 you only need a small heat sink to bring it into sanity. You can save some power by using a smaller capacitor: 3000 uF would give you 2V of ripple and average DC of 5.5V (4.5 to 6.5).
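The figures in the reply above follow from the standard full-wave ripple relation C = I·t/ΔV, where t is the half-cycle recharge interval. A small sketch, assuming 60 Hz mains (the post doesn't state the frequency):

```python
# Capacitor size for a given ripple, and dissipation in each constant-current
# transistor, using the numbers from the reply above. 60 Hz mains is my
# assumption; full-wave rectification recharges the cap every half cycle.

def ripple_cap_uF(i_load, v_ripple, f_mains=60):
    """Capacitance (in uF) needed to hold ripple to v_ripple at i_load."""
    t = 1 / (2 * f_mains)          # recharge interval for full-wave
    return i_load * t / v_ripple * 1e6

def cc_dissipation(i_cc, v_supply, v_batt):
    """Power burned in one CC pass transistor."""
    return i_cc * (v_supply - v_batt)

print(round(ripple_cap_uF(0.7, 1.0)))            # ~5833 uF, i.e. the ~6000 uF above
print(round(cc_dissipation(0.35, 6.0, 3.2), 2))  # 0.98 W per transistor
```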
 
That's a good idea! I don't really need totally clean DC for most of the circuit, so the reduced duty cycle from a little ripple will help dissipation. I'm using a couple TIP120's (Darlington NPNs) for the CC stage, so yeah, they're TO220 cases.

BTW: allelectronics.com has a decent price for a 12.6v 1A transformer, so that's why I was looking at that one specifically. Looks like it should work nicely. A little physically large maybe, but certainly do-able.

Also, how are you figuring the amount of capacitance needed for a certain current with a certain voltage ripple? Where you say "at 700mA you need 6000 uF to get 1V ripple". I've always just used the rule-of-thumb 1000uF/amp.

Thanks guys!
 
1 farad will give a 1V drop in 1 second with a 1A load. 8 millifarads (8000uF) will give 1V drop in 8 milliseconds with a 1A load.
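That rule is just i = C·dV/dt rearranged; a couple of checks using the figures above:

```python
# Voltage droop on a capacitor supplying a constant load current:
# dV = I * dt / C, rearranged from i = C * dV/dt.

def droop(i_load, c_farads, t_seconds):
    """Voltage drop while the capacitor alone supplies i_load for t_seconds."""
    return i_load * t_seconds / c_farads

print(droop(1.0, 1.0, 1.0))      # 1 F, 1 A, 1 s   -> 1.0 V
print(droop(1.0, 8e-3, 8e-3))    # 8000 uF, 1 A, 8 ms -> 1.0 V
```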
 
Just an update on this:

I ended up using an external "wall-wart" as the power supply. To get the 6v DC I needed, I just used a switching R/C BEC modified slightly for a solid 6v at any load. This allows me to use any DC input voltage from around 8v up to around 30v as long as the polarity is correct and the wall-wart has enough current (less current for higher voltages).

This supply was for creating a portable battery charger for some low self-discharge NiMH AA cells. I don't like the common chargers found at most stores as they are timed, or worse yet, just use a current limiting resistor, and so can undercharge or overcharge. So I made one using a temperature termination scheme.

Here's the schematic. Sorry, it's just a hand written one from my notebook.

**broken link removed**


Circuit operation:

R7 and D3 (LED) together put ~2v at the base of Q3. That makes ~0.75v at the emitter of Q3, which generates a constant battery charging current of 0.313A through R8. The 313mA charge rate will take about 7 hours to charge the 2200mAh cells I have. That may seem like a long time, but ~0.3A seems to be the best charge rate for the Turnigy AA LSD cells I got. Diode D4 prevents the battery from backfeeding into the rest of the circuit if the supply is disconnected while the cells are still inserted. The LED (D3) lights up when charging and shuts off when charging is complete, so it is used for charging status as well as charging bias.
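The bias arithmetic above can be sketched like this. The ~1.25v Darlington base-emitter drop is a typical TIP120 figure, and the R8 value is my back-calculation from the stated current, not read off the schematic:

```python
# Constant-current value set by the LED bias at Q3's base and the emitter
# sense resistor R8. v_be_darlington (~1.25 V) and r_sense (2.4 ohm) are
# assumed/back-calculated, not taken from the schematic.

def cc_current(v_base, v_be_darlington, r_sense):
    """Charge current forced through the emitter sense resistor."""
    return (v_base - v_be_darlington) / r_sense

print(round(cc_current(2.0, 1.25, 2.4), 4))  # 0.3125 A, close to the 0.313 A stated
```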

Thermistor R1 (100k at 25*C) is positioned to physically touch the batteries. I also molded some thermal epoxy on the thermistor to conform to the cells for better physical contact. As the batteries reach full charge, they start heating up. This causes the resistance of the thermistor to decrease. The voltage divider consisting of R1 (thermistor), R2, and R3 generates a higher and higher voltage at D1, until it can overcome the diode and turn on Q1, which brings the LED (D3) voltage to ~0.8v shutting off the charging transistor (Q3).
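A sketch of the thermistor trip behavior described above. The post gives only the 100k @ 25°C rating, so the beta value, bottom resistor, and supply voltage here are placeholder assumptions:

```python
# NTC-on-top voltage divider: as the cells heat up, the thermistor resistance
# falls and the tap voltage rises toward Q1's turn-on threshold.
import math

def ntc_resistance(t_celsius, r25=100e3, beta=4000):
    """Beta-model NTC resistance (beta is an assumed value)."""
    t_k, t25_k = t_celsius + 273.15, 298.15
    return r25 * math.exp(beta * (1 / t_k - 1 / t25_k))

def tap_voltage(v_supply, r_ntc, r_bottom=47e3):
    """Divider output with the NTC as the top element."""
    return v_supply * r_bottom / (r_ntc + r_bottom)

v_cold = tap_voltage(6.0, ntc_resistance(25))
v_hot = tap_voltage(6.0, ntc_resistance(38))   # ~100 F cutoff point
print(v_hot > v_cold)  # True: the tap voltage rises as the cells heat up
```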

When Q1 is turned on, Q1 and Q2 transistors basically latch each other on and will keep the charger shut off until power is removed. Kinda like an SCR. This is to prevent the circuit from charging the batteries again once the thermistor cools. You use the normally-closed momentary switch (SW1) to cut power momentarily which resets the latch circuit. C1 just makes the circuit stable when resetting the latch.

I chose 6v because it was high enough that the charging transistor will not saturate near the end of the battery charge (where its voltage is highest), yet low enough to keep power dissipation down.

To calibrate the thermal shutdown, I started with the trim pot resistance at the minimum value. Then, while charging, I monitored the cell voltage. When I saw the voltage peak and then start to decline (which signifies end of charge), I adjusted the trim pot slowly until the charge circuit shut off.

Some pics of the finished product. Yeah, it's not pretty but it does work well, and I didn't feel like fabbing my own enclosure for better aesthetics:

**broken link removed**

**broken link removed**
 
Good project.

It seems like temperature sensing is the easiest way to do it.

How hot does the battery get before the charge is actually terminated?
 
Good project.

It seems like temperature sensing is the easiest way to do it.

How hot does the battery get before the charge is actually terminated?

Surely it is more a case of how much hotter than ambient the battery has to get before the charge is terminated. I think I once saw a similar kit that used two sensors to accomplish this.
 
That's probably a better approach, maybe a thermocouple could be used?
 
Good project.

It seems like temperature sensing is the easiest way to do it.

How hot does the battery get before the charge is actually terminated?

Not really sure how hot the cells get. To calibrate the unit, I charged a set of cells while monitoring the voltage with a meter, and when the cell voltage peaked and then dropped back down a few mv, I set my trim pot to stop the charge. After charging around 15 pairs of cells, a "hand test" for temperature seems to be on par (maybe a tad less) with how hot the cells get when using my true delta voltage peak NiMH R/C chargers. At a guess, around 90-95*F.

Temperature sensing works well and is easy, but I would rather have used a true delta voltage peak scheme. I tried experimenting by using a comparator. One input was hooked directly to the battery positive terminal, the other input was also hooked to the cell, but through an isolation diode that charges a capacitor. I figured the cap would "store" the highest voltage peak. However, I found that when the battery voltage started to drop at the end of the charge, the diode's reverse leakage current was very slowly discharging the cap so it wasn't holding a constant voltage. At the slow 0.3A charge rate I use, the delta peak happens slower than the cap discharges through the diode leakage so the comparator never shuts off the charge circuit.

Surely it is more a case of how much hotter than ambient the battery has to get before the charge is terminated. I think I once saw a similar kit that used two sensors to accomplish this.

I thought of that, but since I plan to use the charger in ambient temperature conditions that do not fluctuate a lot (mostly in 70-75*F room temps), I can get away with ignoring ambient temps. Obviously, if this was a marketed product, I would definitely compensate for high or low ambient conditions.

That's probably a better approach, maybe a thermocouple could be used?

I suppose it could. I was just trying to find components that were very cheap to acquire that would work.

I did have one hiccup during the design process though. When I had the prototype circuit on my experimenter's board, the 2N3904 transistor failed at one point. When I tested it, it had 0 gain. Not sure why this happened as all voltages and currents are well within limits; voltages at or less than 6v and worst-case currents well under 20mA. I replaced it and the circuit worked fine thereafter. When I built the final circuit in the enclosure, I put both 2N390X transistors in sockets to make replacements easy, but after charging lots of batteries, everything still works fine. I'm just curious why the transistor failed. Could it have been "weak" from the manufacturer?

I'm really liking these low self-discharge NiMH cells though. I have a lot of devices that have a slow enough drain that regular NiMH cells self-discharge faster than the load discharges them. Perfect for things like TV remotes, cordless PC keyboards/mice, Wii controllers that don't see a whole lot of use, R/C transmitters, etc. In case anyone is interested, **broken link removed** are what I use. Very inexpensive if you buy enough to compensate for high overseas shipping. I was a bit leery of using Chinese cells (never sure how their QC is), but out of 30 cells I got, all perform equally. A 300mA charge rate seems to work the best. Higher rates need a really sensitive delta-peak or they'll get too hot too fast when charging, and slower rates just take too long. Real-world capacity meets or exceeds the published spec of 2200mAh as long as they are used in lower-drain applications around 0.3C (600-700mA).
 
Temperature sensing works well and is easy, but I would rather have used a true delta voltage peak scheme. I tried experimenting by using a comparator. One input was hooked directly to the battery positive terminal, the other input was also hooked to the cell, but through an isolation diode that charges a capacitor. I figured the cap would "store" the highest voltage peak. However, I found that when the battery voltage started to drop at the end of the charge, the diode's reverse leakage current was very slowly discharging the cap so it wasn't holding a constant voltage. At the slow 0.3A charge rate I use, the delta peak happens slower than the cap discharges through the diode leakage so the comparator never shuts off the charge circuit.

Use a dv/dt sensor - a comparator with hysteresis, with one input connected directly to the battery and the other via an RC circuit.

See the attached schematic.

The output will transition from low to high or high to low (depending on the input) when the dv/dt setting is exceeded. The output will default to high when the power is initially applied, assuming the capacitor voltage is near 0V, so you shouldn't have to worry about the charging dv/dt.

As the battery charges the output will be high; when it's fully charged, the input voltage will start to drop, which should cause the output to go low. If the voltage change is too low, increase the RC time constant. You'll probably have to experiment to get it working.

Note: I've never done this before but it should work.
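The dv/dt idea above can be checked numerically: one comparator input tracks the battery directly while the other lags through the RC, and the lagged copy ends up above the direct input once the battery droops past its peak. All component values and the voltage profile below are invented for illustration:

```python
# Numerical sketch of a dv/dt (delta-peak) detector: the RC-lagged input
# crosses above the direct input only after the battery voltage peaks.

def simulate(v_profile, dt, rc, trip=0.005):
    """Return the first time the lagged input exceeds the direct input by `trip`."""
    v_lag = v_profile[0]
    for i, v in enumerate(v_profile):
        v_lag += (v - v_lag) * dt / rc   # first-order RC toward the input
        if v_lag - v > trip:             # battery has dropped below the lagged copy
            return i * dt
    return None

# Battery rises to a peak, then droops very slowly (end-of-charge signature).
dt, rc = 1.0, 600.0                                          # 1 s steps, 10 min tau
profile = [2.8 + 0.0005 * t for t in range(600)]             # slow rise
profile += [profile[-1] - 0.00002 * t for t in range(3000)]  # slow decline
print(simulate(profile, dt, rc, trip=0.005) is not None)     # True: peak detected
```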
 

Attachments

  • dv dt detector..PNG
Ni-MH cells are made in Japan for Energizer (maybe by Sanyo). They are all low self-discharge now. The AA cells are 2450mAh.
I get coupons online or from a local lumber store then I buy the cells when they are on sale at a food store.
 
Hero, that circuit is VERY similar to what I tried. However, as the battery voltage started to dip, the cap was slowly discharged. Like I said, I tried to use a diode on the cap input (between the 100k ohm resistor and the common point in your schematic), but even the leakage of the diode was enough to discharge the cap. Not only that, but the comparator's (LM339) input impedance was too low (even though it's in the M ohms) and would also discharge the cap slowly.

I even tried to buffer the cap by using a TL082 op-amp (gain of 1) since its input impedance is 10^12, but in the end, the diode's reverse leakage was too high. Because I'm charging at such a low current (0.3A), it takes a while for the delta peak to show up, and that time is long enough for the cap to discharge.

I even tried a 1000uF to lengthen the discharge time constant, but it wasn't enough. If I used a diode with extremely low leakage current, I would still need something like a 10,000uF or higher capacitor to make the discharge time long enough to be negligible. Don't forget, I am trying to sense delta peaks in the 2-4mv range over a 5-10 minute period so any current flow has to be taken into consideration. Input impedances and tiny leakage currents become a big factor. If I had "ideal" components; diodes with no leakage at all, and comparators with infinite input Z, zero output Z, and rail-to-rail outputs, then this circuit would be a piece of cake. :)
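To put the leakage problem in numbers (the leakage current here is an illustrative guess, not a measured value):

```python
# How far a peak-hold capacitor sags from diode reverse leakage while waiting
# for the delta peak to appear. dV = I * t / C, expressed in millivolts.

def droop_mV(i_leak_A, t_s, c_F):
    """Sag on the hold capacitor over t_s seconds of leakage."""
    return i_leak_A * t_s / c_F * 1e3

# Even 25 nA of leakage sags a 1000 uF cap by 15 mV over 10 minutes --
# several times the 2-4 mV delta peak being looked for.
print(round(droop_mV(25e-9, 600, 1000e-6), 1))  # 15.0
```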

I appreciate the input though. I guess what I am trying to do needs to be done with a special IC. As such, I ordered a few samples of **broken link removed**, but not sure if I will like it. It constant-current charges up to 4 cells individually, but I don't really care for the pulse charge scheme. If I want each cell to take 0.3A charge, it will be pulsed with 1.2A since the pulse has a 25% duty cycle. But it does have several other features such as pre-charge, thermistor support (for supplemental temperature detection), Alkaline battery detection, charge top-off, and maintenance charge functions. So, I plan to give it a try and see how I like it.

I really wish there were more portable and inexpensive battery chargers on the market that used a proper charge termination scheme. All the ones I've seen either use a timed charge or are just a power supply with a current limiting resistor.
 
Ni-MH cells are made in Japan for Energizer (maybe by Sanyo). They are all low self-discharge now. The AA cells are 2450mAh.
I get coupons online or from a local lumber store then I buy the cells when they are on sale at a food store.

Really? I just bought 4 Energizers about a month ago. I charged them up ready for my digital camera, but when I was ready to put them in about a week ago, they had lost around 15% of their capacity.
 
Energizer cells have been low self-discharge for about 1.5 years of sales.
 
If that's true (I'm not doubting you), I'd like to know what Energizer's definition of "low self discharge" is. I've gotten quite a few in that timeframe and before, and I don't see any difference in how they perform. I was actually using those cells in my R/C radio (Spektrum DX3r), and if the transmitter sat on the shelf for a month (in the "off" state of course), I would easily lose 0.5v over 4 cells, and that's not counting fresh-off-the-charger voltage, where they tend to drift back down after a while anyway. The Turnigy cells I just got are lucky to lose a 5th of that amount, if that. I wonder if Energizer is just using an older-generation chemistry/construction? I was reading a review of various manufacturers' LSD cells a while back and there were pretty substantial differences in self-discharge, so I guess Energizer is just not using the best cell. Maybe that's why I've never seen anything in their advertising saying they are LSD. You'd think their marketing staff would make it a selling point.
 
I kept the packages for the UPC symbols so I can get products from Energizer.
The package for the Ni-MH AA cells rated at 2450mAh says, "Lasts 40% longer in digital cameras than MAX alkaline cells" and "holds charge 40% longer than Ni-MH cells sold in 2005".
The package for the newest Ni-MH AA cells says, "charge 150 times more and charge lasts 6 months longer than 2450mAh cells".
 
Hmm. I do remember the "Lasts 40% longer in digital cameras than MAX alkaline cells" phrase, but not the others.
 