MPPT charge controller schematic assistance

barg

New Member
Hi Everybody,

I am working on a project to build an MPPT charge controller using an NE555 or a PIC-based MCU, depending on efficiency/complexity…

Charging details: LiFePO4 4S, 4Ah (charging cutoff voltage @ 14.4V); solar panel 18V, 1A Imp. The main issue here is to protect against overvoltage (no more than exactly 14.4V) and of course to start charging at exactly 10V.

I need your assistance to find a relevant schematic that is tested and works efficiently.

Thank you!
 
Have you calculated how much faster you can charge your battery using MPPT vs just using the panel as a current source to charge the battery? How much time do you think you can save?
 
Hi Mike,

The charge rate can be up to 2A. I am really at the point of choosing between PWM and MPPT, but I lack information on how to calculate the losses of these two methods for my spec above, to see whether the loss with PWM is too high and how much current/efficiency I would gain by going with MPPT.
 
The most you can gain with MPPT is about 25%; hardly worth the trouble, if you ask me... If it matters, it is much easier to buy a slightly larger panel, especially since panels are getting cheaper by the day... Besides, the MPPT voltage for an 18V panel is very close to 14.4V...
 
Thanks Mike,

Does that mean the odds favor PWM in my case?

In addition, please advise: what do you mean by the 18V panel's MPPT voltage being very close to 14.4V? Is the charging going to be affected by that small gap if I go with PWM?
 
Every panel has a characteristic output Current vs output Voltage curve. The Maximum Power Point on that curve is where the panel delivers its maximum possible output power (I*V) to a load. If the panel's loaded voltage can be controlled to an optimum voltage (instead of that voltage being a function of the load), the panel can deliver a bit more power than it otherwise could.
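
To make that concrete, here is a small illustrative sketch (not from the thread; the I-V numbers are invented, not from any real panel's datasheet) that scans sampled I-V points and picks out the maximum power point:

```python
# Hypothetical I-V samples for a small "18 V" panel (made-up numbers).
# Each pair is (panel_voltage_V, panel_current_A).
iv_curve = [
    (0.0, 1.05), (5.0, 1.04), (10.0, 1.02), (12.0, 1.00),
    (14.0, 0.97), (15.0, 0.92), (16.0, 0.80), (17.0, 0.55),
    (18.0, 0.25), (19.0, 0.00),
]

# Output power at each operating point is P = V * I; the MPP is its maximum.
p_mpp, v_mpp, i_mpp = max((v * i, v, i) for v, i in iv_curve)
print(f"MPP: {p_mpp:.2f} W at {v_mpp:.1f} V / {i_mpp:.2f} A")
```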

This requires an "impedance converter" to convert the actual load impedance to the panel's optimum MPP impedance. Usually, a special switch-mode power supply (SMPS) is used as the impedance converter. Its feedback keeps the input side at a panel current and panel voltage that maximize the panel power, while keeping the output voltage at whatever the load requires.

It just so happens that the voltage at which your panel develops its maximum power will be somewhere between 12 and 14V, which is right around the charging voltage of your battery. Just connect the panel to the battery, and switch off the panel when you determine the battery is charged. No SMPS (or PWM) required; a PFET used as an on/off switch would do it.


What is the charging algorithm for your battery?
 
Hi,

I have to agree that a switching power converter is required in order to draw the array down to its optimal operating point while supplying that power to the battery for charging, until the battery gets close to fully charged, at which point it would cut back the power anyway. The switcher would look like a voltage-limited current source to the battery, adjusting the current level as needed to keep the panel at its optimum. Once the battery is fully charged (if that ever happens), the control would cut back the current, and so the array would no longer be running at the optimum operating point.

In the applications I dealt with in the past, the switcher would be pumping current back into the mains line (not a battery) in order to sell the power back to the power company.
 
When connecting the panel directly to the battery, it indeed shows the battery voltage, which was 13.15V at the time of measurement.

How is it possible to measure the panel current under load if the battery drains all the current?

I just need to clarify: you are suggesting not to use an SMPS in my case (20W solar panel to a 14V 4Ah battery), only to protect against overvoltage, without any PWM SMPS?
 
Hi Al,

I also agree with your point about the control benefits of an SMPS, even in a small system, but it is getting difficult to calculate the relevant minimum panel load current when I only have the Imp from the datasheet's test conditions and the load I need is 0.5A; in my case that is 18V Vmp, 1A Imp.

Even if the manufacturer gives these details, I am not sure I can build on that lab data...
 
When connecting the panel directly to the battery, it indeed shows the battery voltage, which was 13.15V at the time of measurement.

How is it possible to measure the panel current under load if the battery drains all the current?

I just need to clarify: you are suggesting not to use an SMPS in my case (20W solar panel to a 14V 4Ah battery), only to protect against overvoltage, without any PWM SMPS?

Yes, all available panel current is going into the battery. As the battery charges, its voltage goes from 10 to 14V? If so, there is no need for an SMPS at all. Just let all available panel current go to charging the battery. The panel voltage is determined solely by the battery voltage.

Let the battery voltage determine when to stop charging. It seems to me that you do not need to measure panel current as part of the control scheme. If you do, write back and I will suggest a current-monitor circuit.
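
For illustration only, here is a minimal Python sketch of that on/off control law with hysteresis, using the 10V start and 14.4V stop thresholds from the first post (charge_switch_state is a hypothetical name, and the voltage sweep at the bottom just simulates readings):

```python
V_START = 10.0   # resume charging at/below this voltage (OP's spec)
V_STOP = 14.4    # cut off charging at this voltage (OP's spec)

def charge_switch_state(v_batt, charging):
    """Hysteresis control law for the on/off PFET: returns True to charge."""
    if charging and v_batt >= V_STOP:
        return False          # battery full: open the switch
    if not charging and v_batt <= V_START:
        return True           # battery discharged: close the switch
    return charging           # between thresholds: keep the current state

# Simulated pass: voltage climbs to the cutoff, then sags back down.
charging = False
for v in (9.5, 11.0, 13.0, 14.4, 14.0, 13.5, 10.0):
    charging = charge_switch_state(v, charging)
    print(f"{v:5.1f} V -> {'CHARGING' if charging else 'off'}")
```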
 
Hi Mike,

Actually, I understand your point because of the "small gap" between the 15W/20W panel voltage and the battery voltage, but I need the SMPS to stop the charging at exactly 14.4V, since this lithium battery is very sensitive and it needs to work 100% automatically. So do you have another solid solution to control the cutoff, other than an SMPS? That would be preferable to me.

In addition, can you please explain a rule of thumb for determining the proper solar panel power to charge at 0.5A? Is 15W (0.8A Imp) enough?

Thanks
 
If your battery capacity is 0.5Ah, then the panel must deliver 500mA for 1 hour, or 100mA for 5 hours, or 1.0A for 30 minutes. Since no battery is 100% efficient, you must actually put in about 20% more (roughly 120% of the rated capacity)...

What you need is a Li-ion battery charge controller; not necessarily an SMPS, and it does not have to know anything about MPPT!
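
As a quick back-of-the-envelope check of that arithmetic (the 80% efficiency figure is an assumption for illustration, not a measured value):

```python
CAPACITY_AH = 0.5   # example capacity from the post above
EFFICIENCY = 0.8    # assumed ~80% charge efficiency (illustrative)

# Ideal charge time is capacity / current; real time is padded for losses.
for current_a in (0.1, 0.5, 1.0):
    t_ideal = CAPACITY_AH / current_a
    t_real = t_ideal / EFFICIENCY
    print(f"{current_a:.1f} A -> {t_ideal:.1f} h ideal, ~{t_real:.1f} h with losses")
```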
 
Battery capacity is 4Ah; my intention is to charge it with the panel at 0.5A. The panel is rated 18V Vmp, 0.83A Imp. My question is: is there a way to know what maximum current this panel can deliver, theoretically or practically, at 14.4V?

As to the controller, I will look into this Li-ion issue.

thanks
 
Hi,

Since you say this is a Li-ion battery, that brings in a set of special considerations. There is a lot written on the web about this, so you could do more reading too. I will mention the main points that every system designer should know and take into consideration.

1. One is the terminal voltage, which must be controlled precisely, and you already know about that one, which is good.
2. The second is current limiting, which must be found from the data sheet.
3. The third is the low-current cutout point, which many believe is a safety factor because of possible plating of the electrodes, which destroys the battery over time, shortening its life.
4. The fourth is a timer to make sure there is no way the battery can be charged indefinitely.
5. Charge balancing.
6. Temperature sensing (optional).

[1] For #1, we want a circuit that can detect the voltage and either turn off the charge current or reduce it in order to keep the voltage from going too high. There are two basic approaches: Abrupt and gradual.
With a purely linear charger, the current is cut back little by little so the voltage never gets too high, and then when the current gets to a certain low level, like 5 percent of the normal charge current, the charging is shut off completely and never allowed to turn back on without user intervention, or alternatively until some low voltage level has been reached (like 3.5 volts).
The above is considered a very good method.
If the voltage is to be simply cut off, though, then we need a pretty accurate set point: turn off the charge controller and do not allow it to come back on until the battery has dropped to some low level again, like 3.5V. If this happens too fast, the timer would prevent charging anyway, and thus prevent a faulty cell from being constantly charged.

[2] Current limiting is not too difficult: the current has to be monitored and the controller forced to cut back the current level if it goes too high. If the source can never supply the normal charge current, however, then it may already be current limited. If the panel cannot supply, say, 1 amp when the normal charge current is 1 amp, then we might consider it already limited.

[3] As the battery gets close to full charge, the current can get low, and very low current is not considered good for a Li-ion because it is not enough to reach the point where the charge acceptance is high enough to actually charge the battery. Instead it causes plating, which ruins the battery. So some low-level cutout is also a good idea. I think the consensus here is that 5 percent of the normal charge rate is the right cutout point. So if normal charging is 1 amp, then the low-level cutout is 50mA. Anything less than that might be considered too low. Personally I think it can be lower, like 25mA, but I don't want to interject any personal views too strongly here. The way I charge my cells is that I never leave them too long anyway, but that requires human intervention, something that a completely automatic system doesn't have.

[4] The timer is the last safeguard. This prevents charging for too long if something else goes wrong, which might include a significantly aged cell since it may never get to the fully charged state.

If your charger design takes into account all four (1 through 4) of these things, you'll have a very good, safe system; a firmware-style sketch of that logic follows below. You can add temperature sensing to that, but that's not always done. Temperature sensing would be used in an environment where it would be extremely hazardous to have something get too warm, or if you want an extra built-in measure of safety.
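
As a sketch only, here is how points 1 through 4 might fold together into one charge loop. The toy capacitor battery model, the 8-hour timer, and the 0.95 cutback factor are illustrative assumptions, not values from this thread:

```python
V_MAX = 14.4              # [1] strict terminal-voltage limit (4S pack)
I_MAX = 1.0               # [2] charge-current limit from the datasheet (example)
I_CUTOUT = 0.05 * I_MAX   # [3] cut out at 5% of the normal charge current
T_MAX_S = 8 * 3600        # [4] safety timer, e.g. 8 hours (example value)

class ToyBattery:
    """Crude stand-in for real hardware: a capacitor that integrates current."""
    def __init__(self):
        self.v = 10.0                     # start partially discharged
    def step(self, i_amps, dt_s):
        self.v += i_amps * dt_s / 3273.0  # 4 Ah over a 4.4 V swing, as a capacitor

batt = ToyBattery()
limit = I_MAX
t, dt = 0.0, 10.0
while t < T_MAX_S:                        # [4] never charge indefinitely
    i = limit                             # [2] regulator enforces the current limit
    batt.step(i, dt)
    if batt.v >= V_MAX:
        batt.v = V_MAX                    # regulator clamps the voltage here
        limit *= 0.95                     # [1] ease the current back to hold V_MAX
        if i <= I_CUTOUT:
            break                         # [3] low-current cutout: charging done
    t += dt
print(f"Charge ended at t = {t/3600:.1f} h, V = {batt.v:.2f} V, I = {limit:.3f} A")
```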

[5] If your battery includes multiple cells in series, then you also have to think about charge balancing. Since they don't make a 14.4V cell, there must be several cells in series in this pack. That means it probably has a built-in charge-balance mechanism, but you should check that. It could be made from circuits internal to the pack that bypass certain cells that are considered charged already.

Now a quick note about the methods I have used in the past with great success.

Linear or pseudo-linear charging with gradual current cutback, with a second isolated circuit to monitor charging. This includes secondary voltage and current measurements to detect any fault in the primary charge system.
I've also used a purely linear circuit with gradual current cutback, using a simple LM317 circuit, which I can post here if needed. A few parts do the trick, but we'd add a low-level current cutoff.
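
For reference, the standard LM317 constant-current relationship (a textbook formula, not necessarily the exact circuit mentioned above): the regulator holds about 1.25V between its OUT and ADJ pins, so a resistor R placed there programs the current to I = 1.25/R.

```python
V_REF = 1.25  # volts held across the LM317 program resistor

for i_charge in (0.1, 0.5, 1.0):   # candidate charge currents in amps
    r = V_REF / i_charge           # program resistor value: R = V_REF / I
    p = V_REF * i_charge           # power dissipated in that resistor
    print(f"{i_charge:.1f} A -> R = {r:5.2f} ohm, P_R = {p:.2f} W")
```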

BTW, the battery never 'uses' all the current. The current flows through the battery, but the current can still be measured using one of many types of circuits to measure current just like a multimeter would do. So the current does not get 'used up' or anything like that.
 
Good post...

...
BTW, the battery never 'uses' all the current. The current flows through the battery, but the current can still be measured using one of many types of circuits to measure current just like a multimeter would do. So the current does not get 'used up' or anything like that.

A better way of saying this is if the panel is connected to the battery early in its charge cycle (before the battery voltage reaches the set point of the voltage regulator), then all the available panel current will flow into the battery. Only after the battery reaches the voltage limit will the panel current drop.
 

Hi there Mike,

Surely you don't want to rely on the current dropping enough all by itself in order to limit the voltage to the required strict limit of 14.4 volts, do you? That would probably be a mistake, because if the cells can charge even a little with the reduced current, then the voltage will rise and rise and rise. There has to be a control law, and that control law must include some specification of the voltage limit set point. In this case it appears to be 14.4V, but in all my chargers it was always 4.200 volts (single cell). That's a very strict requirement for the Li-ion type cells, and I would never build a charger that worked off of any power source without some definite mechanism to limit the voltage once it reached or got close to the set-point max.
Even if it looked like the panel was going to limit the voltage at 14.4 (or whatever), I would still not rely on that alone, because this is really a variable with any array. Remember, arrays are just diodes with light-exposed dies that have a certain characteristic voltage, but that voltage level is not set in stone. There's always some variation due to temperature and other factors.
If the max cell voltage were not so darn strict I would not care one bit, but since this has to be carefully planned out in any charger design to keep the cell happy and safe, I would always have to impose this kind of constraint.

As I write this, though, I am now wondering where that spec of 14.4 volts came from. For standard Li-ion, one cell is 4.2V, two cells are 8.4V, three cells are 12.6V, and one more cell in series would bring us up to 16.8 volts, so 14.4V does not fit.
The answer is in the first post: the pack is LiFePO4, not standard Li-ion, and for that chemistry the usual charge voltage is about 3.6V per cell, so a 4S pack charging to 14.4V checks out.
 
Here is the way I think of this problem:

[Schematic: panel modeled as a 1A current source charging a capacitor "battery", with a Zener shunt regulator across it]


The panel can be modeled as a simple current source. It puts out a nominal 1A at any output voltage between 10V and 14V. You can quibble about the details, but this is a pretty good first-order model.

The battery can be modeled as a simple capacitor: it integrates the current into it, and its voltage increases as it does so. You can quibble about the details, but this is a pretty good first-order model.

The goal is to stop the battery voltage from increasing when it reaches a pre-determined limit. A simple shunt regulator will do that; in this case, a Zener.

The time-domain simulation shows what happens when the "battery" is pre-charged to 10V, and the sun begins to shine. Initially, the full 1A flows into the battery, and none flows in the shunt regulator. When the battery reaches the Zener clamp voltage, the current into the battery decreases as the panel current is diverted away from the battery by the shunt regulator.

Your concern about the battery current going to zero asymptotically can be addressed by monitoring the current into the battery, and isolating it from the panel when the battery current drops below a threshold...
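
The same first-order model can be reproduced in a few lines (the capacitance is a made-up stand-in, sized so a 4Ah pack swings 4.4V: C = 4*3600/4.4 ≈ 3273F):

```python
I_PANEL = 1.0      # A, panel modeled as a constant current source
C_BATT = 3273.0    # F, battery modeled as a capacitor (4 Ah over 4.4 V)
V_ZENER = 14.4     # V, shunt-regulator clamp voltage

v, t, dt = 10.0, 0.0, 10.0   # pre-charged to 10 V; 10-second time steps
while v < V_ZENER:
    v += I_PANEL * dt / C_BATT   # dV = I*dt/C: all panel current charges the battery
    t += dt
print(f"Battery reaches {V_ZENER} V after ~{t/3600:.1f} h; "
      f"beyond that the Zener diverts the panel current.")
```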
 
Hi Mr Al,


Well, thanks for your detailed and insightful notes.


I think I might use a ready-made PWM circuit I found on the net that basically works like this: it charges the battery freely up to 14.4V, stays there for a while, then starts to modulate down to a 13.5V float level, and stays there until charging stops at dusk, with no time limit. That covers points 1-2 as you noted.


As for point 4, the timer, I am not sure this circuit has an internal clock, even though it looks like a neat solution, and I don't know how to add one; I'd be glad to learn from you… basically I am looking for a simple addition, if you know of any.


As for temperature sensing, I will check whether that option is already covered by the battery's PTC. As for balancing, I prefer a pack with built-in balancing; great that you floated this option as well.


As for point 3, the low-current cutoff point, please advise how it can simply be done. Can you show any relevant schematic for this one?


As for your LM317 solution, my concern is not to waste power in linear charging but to use switching PWM as I noted; do you think that is the right choice in terms of efficiency for my spec above?
 
...

As for your LM317 solution, my concern is not to waste power in linear charging but to use switching PWM as I noted; do you think that is the right choice in terms of efficiency for my spec above?
No power is lost with the scheme I showed in post #17. Only after the battery reaches its maximum allowed charging voltage is any available current diverted away from the battery. Prior to the battery reaching that voltage, 100% of the available solar current goes into the battery.
 