How to design a buck converter (step-down converter) with variable output voltage?


spikeeeee

New Member
I am planning to design a buck converter to charge some 3000F ultracapacitors at 40A. This is the datasheet for the ultracapacitor: https://www.electro-tech-online.com/custompdfs/2013/07/datasheet_k2_series_1015370.pdf . I have 30 of them, which results in an equivalent capacitance of 100F at 75V with a total ESR of 8.7mohm (which is very low). I am planning to use them in an electric car application. When charged from a battery or some other source, these ultracapacitors act as a short circuit across the charging source, so to charge them I need to develop a buck converter circuit whose output voltage varies from 1 to 75V. The reason for the variable output voltage is that when the capacitor voltage is at 0V I would charge it at 1V.

R = V/I = 1/40 = 0.025 ohm

To have a charging current of 40A, the total series resistance in the charging path must be 0.025 ohm, so I will add a resistor of 0.025 - 0.0087 = 0.0163 ohm (16.3 mohm) in series with the capacitor to limit the maximum current to 40A.
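A quick sanity check of those numbers as a rough sketch (only the figures quoted above are used; losses elsewhere in the circuit are ignored):

```python
# Rough sanity check of the series-resistor current limit described above.
# Values from the post: 1V minimum charging voltage, 40A target current,
# 8.7 mohm total ESR for the bank.
V_SOURCE_MIN = 1.0       # V, lowest charging voltage considered
I_TARGET = 40.0          # A, desired charge current
ESR_TOTAL = 0.0087       # ohm, equivalent series resistance of the bank

# Total resistance needed so that 1V across a discharged bank gives 40A
r_total = V_SOURCE_MIN / I_TARGET        # 0.025 ohm
r_added = r_total - ESR_TOTAL            # ~0.0163 ohm external resistor

# Worst-case dissipation at full charge current
p_added = I_TARGET**2 * r_added          # ~26 W wasted in the added resistor
p_esr = I_TARGET**2 * ESR_TOTAL          # ~14 W lost in the capacitors' ESR

print(f"added resistor  : {r_added*1000:.1f} mohm")
print(f"loss in resistor: {p_added:.1f} W, loss in ESR: {p_esr:.1f} W")
```

At full current the added resistor alone dissipates roughly 26 W on top of the ~14 W lost in the ESR, which is the efficiency penalty of limiting the current with resistance.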

My confusion is why there are not many designs of buck converters with variable output voltage. What are the drawbacks in designing such a variable-output buck converter? Can it be designed? I just want this charging circuit to be as efficient as possible, with very little loss in the power stage during the transfer. Cost of the circuit is not a concern, since the circuit is designed mainly for very high efficiency.

Kindly suggest if there are other, more efficient ways to charge it.

thank you in advance. :)
 
It would likely be better to use a constant current buck regulator. But designs for a 40A constant-current regulator are not common. A simple hysteretic regulator, configured to deliver a constant current, might be a good choice.
 
Hello there,


The difference between a regular buck circuit and a current-regulated buck circuit is that the regular buck measures the output voltage and uses that as feedback, while the current-regulated buck measures the output current and uses that as feedback. With the output current as feedback, the current is regulated instead of the voltage, with the maximum voltage defined by the voltage setting and regulated by measurement of the voltage. So the current regulator measures both current and voltage and feeds them both back to the control stage. The control stage decides which type of control is used at any given time: for example, while the capacitor is charging, current control is in effect, and once the cap is charged, voltage control takes over.

The current control feedback should be fast, though, so the active elements should be transistors rather than op amps unless you are willing to provide a compensation network. In most cases the current control does not need to be very accurate, because a range of currents is usually acceptable. In your case a target of 40 amps would most likely translate to something like 35 to 45 amps, but you can usually hold it much tighter than that without too much trouble.

So the overall design ends up looking like a lead acid battery charger, where we have the current control limiting the charge current and then eventually the voltage control takes over and limits the voltage.
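One common way to picture that "decide which loop is in charge" step is to run both error amplifiers and let whichever asks for the lower duty cycle win. A toy sketch of that control law, using the 75V / 40A targets from this thread (the gains are arbitrary, and a real converter would use compensated PI loops rather than bare proportional terms):

```python
def duty_command(v_out, i_out, v_set=75.0, i_set=40.0, kp_v=0.05, kp_i=0.02):
    """Toy CC/CV control law: both loops compute a duty-cycle request and the
    more restrictive (smaller) one drives the switch."""
    d_voltage = kp_v * (v_set - v_out)   # voltage-loop request
    d_current = kp_i * (i_set - i_out)   # current-loop request
    d = min(d_voltage, d_current)        # whichever loop wants less power wins
    return max(0.0, min(1.0, d))         # clamp to a valid duty cycle

# Early in the charge the current loop limits; near full charge the voltage loop does.
print(duty_command(v_out=5.0, i_out=38.0))    # current near target -> current loop sets a small duty
print(duty_command(v_out=74.5, i_out=5.0))    # near 75V -> voltage loop takes over
```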

One of the simplest forms of current control involves a single transistor that measures the current and boosts the voltage feedback signal so that it effectively takes over the control of the circuit. The drawback is that we need to drop around 0.6v for the measurement, and at 40 amps this means wasting 24 watts which may or may not be acceptable.

40 amps is not a light current though, and inductors get expensive at these currents, so a relatively high switching frequency would probably be used to keep the inductor small.
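To put a number on that inductor-versus-frequency trade, the usual buck ripple relation L = Vout x (1 - D) / (fsw x dI) can be evaluated for a few switching frequencies; the 100V input and 30% ripple fraction below are only example assumptions:

```python
def buck_inductor(v_in, v_out, f_sw, i_ripple):
    """Buck inductance for a given peak-to-peak ripple current
    (continuous conduction, ideal switch and diode assumed)."""
    d = v_out / v_in                              # duty cycle
    return v_out * (1.0 - d) / (f_sw * i_ripple)

# Example only: 100V input, 40V output, 30% of the 40A charge current as ripple.
for f in (25e3, 100e3, 400e3):
    L = buck_inductor(v_in=100.0, v_out=40.0, f_sw=f, i_ripple=0.3 * 40.0)
    print(f"{f/1e3:5.0f} kHz -> {L*1e6:4.0f} uH at 40 A")
```

Going from 25 kHz to 400 kHz drops the required inductance from about 80 uH to about 5 uH for the same ripple, which is why the switching frequency is usually pushed up at these currents.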
 

thank you for your reply Mr Al. I have a doubt: if I want to charge a capacitor at 40A which is initially at 0V, I will have to apply 1V, right? Even if it is current-mode controlled, wouldn't the converter indirectly regulate the voltage to 1V? Hence I thought that instead of controlling the current, regulating the voltage would be easier. Kindly correct me if I am wrong.
 
Generally it's just as easy to control the current output as it is the voltage. You just add a small resistor in series with the load to measure the current and use the voltage across that resistor to control the converter.

I suggested a hysteretic converter since they generally are simpler as they require little or no compensation in the feedback loop.
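A crude time-step sketch of a hysteretic current regulator, just to show how little control logic it needs; the component values below are arbitrary examples, not a design:

```python
# Crude simulation of a hysteretic constant-current buck feeding a large capacitor.
# All values are illustrative assumptions, not a tested design.
V_IN, L, C = 100.0, 20e-6, 100.0       # supply, inductor, ultracap bank
I_LOW, I_HIGH = 38.0, 42.0             # hysteresis band around 40 A
V_DIODE = 0.7                          # freewheel diode drop
DT = 1e-6                              # 1 us simulation step

i_l, v_c, switch_on = 0.0, 0.0, True
for step in range(1_000_000):          # simulate 1 s of charging
    # Bang-bang control: only the sensed inductor current sets the switch state.
    if i_l >= I_HIGH:
        switch_on = False
    elif i_l <= I_LOW:
        switch_on = True

    # Inductor sees (Vin - Vc) when the switch is on, (-Vdiode - Vc) when freewheeling.
    v_l = (V_IN - v_c) if switch_on else (-V_DIODE - v_c)
    i_l = max(0.0, i_l + v_l / L * DT)
    v_c += i_l / C * DT

    if step % 250_000 == 0:
        print(f"t = {step*DT:4.2f} s  Vcap = {v_c:5.2f} V  IL = {i_l:4.1f} A")
```

The switching frequency of a hysteretic loop is not fixed; it drifts with the input and output voltages, which is the usual price paid for skipping the compensation network.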
 
There is a 3-transistor buck regulator here that is both current and voltage regulated:
https://romanblack.com/smps/a04.htm

With those parts values it is limited to <1 amp, but the basic principle could be scaled up (or might give you some ideas).
 
Where is your charging current coming from? That makes a difference.
Crutschow is right about the regulator. It will have two feedback loops: one to hold the current below 20 amps and the other to hold the voltage below 75 volts.

If you want a more complicated power supply I can make one that charges at:
20A at 75V
40A at 40V
80A at 20V
100A below 15V
It will charge faster.
 

Hi,

I was under the impression that the current the capacitor could take was 40 amps to charge or discharge. If that is not true, then you have to find out what it really is because that will affect the design. If the cap can only take 10 amps for example, then all you have to do is design a buck circuit that can put out 10 amps and then add current regulation.

Adding current regulation to a lower-power regulator like this is easy. Depending on the type of voltage control, the feedback pin gets pulled either higher or lower as the current is measured. For example, with a buck chip that has a 2.5v reference, as the feedback voltage goes above 2.5v the chip starts to cut back the output pulse width, which lowers the output power. So to make it regulate current, you add a circuit that measures the current and pulls that feedback voltage higher when the current gets too high; that also lowers the output power, and so the current gets regulated.
For a chip like the LM317, the ADJ pin is pulled lower instead of higher, which decreases the output and thus lowers the current. This can be done with a single transistor whose collector is tied to the ADJ pin, with the output current made to flow through a current sense resistor connected across the base-emitter junction. As the current increases, the transistor eventually starts to pull the ADJ pin voltage down, and that lowers the output current. So in this case it is just a transistor and a couple of resistors.
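The arithmetic for that single-transistor sense scheme follows directly from the ~0.6v base-emitter threshold; a few example current limits:

```python
# Sense resistor sized from the ~0.6 V base-emitter turn-on of the sense transistor.
# The current limits are just example values.
V_BE = 0.6                              # V, roughly where the transistor turns on
for i_limit in (1.0, 10.0, 40.0):
    r_sense = V_BE / i_limit            # resistor across the base-emitter junction
    p_sense = V_BE * i_limit            # power burned in it at the current limit
    print(f"{i_limit:4.0f} A limit: R = {r_sense*1000:5.1f} mohm, {p_sense:4.1f} W lost")
```

The 40 A row reproduces the 24 W figure mentioned earlier, which is why at high currents a lower-drop sense method is often worth the extra parts.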

But the main idea is to move to current regulation with voltage regulation on top of that. It's never about pure voltage regulation, where you regulate to 1v or 2v when the cap is at 0v, and 4v or 5v when the cap is at 1v, etc. It is about regulating the current itself.
However, if you don't mind wasting a bit of power, you can use a resistor in series with the capacitor and regulate the voltage on the supply side of the resistor based on what the capacitor voltage is. I think this is what you had in mind. That way you only have to regulate a voltage, just not the voltage on the cap itself. What it really means, though, is that we are regulating the voltage across the resistor, and regulating that voltage is really just regulating current, because I=E/R with R fixed. It also means we waste more power, because P=I*I*R with R fixed. So this is not the most efficient way to do it, even with a switching buck circuit, unless the resistor value can be kept small, but then we have a current regulator again, not a voltage regulator :)

And when you have a capacitor with 0v across it you do not apply 1v to charge it. That's because to apply 1v across a capacitor in zero time requires an infinite current to flow, which would blow the capacitor :)
What you do is apply a current of maybe 1 amp or 10 amps or whatever it will take without damage, and wait for the voltage to rise. Once the voltage rises to, say, 50v, you then switch to voltage regulation, and that keeps the voltage at 50v until it is discharged. Also, because of the current regulation, you do not need to disconnect the charge circuit when discharging if you don't want to, because the current will be limited anyway.
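For the bank in this thread, the constant-current phase sets essentially all of the charge time; a rough estimate, ignoring ESR and converter losses:

```python
# Rough constant-current charge time for the 100 F / 75 V bank, losses ignored.
C_BANK, V_FULL, I_CHARGE = 100.0, 75.0, 40.0

t_cc = C_BANK * V_FULL / I_CHARGE       # Q = C*V, t = Q/I  -> about 188 s
e_full = 0.5 * C_BANK * V_FULL**2       # ~281 kJ stored when full
p_end = V_FULL * I_CHARGE               # 3 kW drawn just before voltage mode takes over

print(f"CC phase: {t_cc:.0f} s ({t_cc/60:.1f} min) at {I_CHARGE:.0f} A")
print(f"stored energy: {e_full/1e3:.0f} kJ, charge power at end of CC: {p_end/1e3:.1f} kW")
```

Note that the power drawn climbs to about 3 kW just before the voltage loop takes over, which is what the 1500-watt wall-outlet comments below are getting at.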
 
Looking at the capacitor's data sheet, you can charge at 130 amps.
Look at my last post. The charging currents I listed work out to about 1500 watts, or about all you should draw from a simple 110VAC outlet.

Putting 130 amps into a 0-volt capacitor takes about 0 watts (130 x 0 = 0). That is why I would charge with a power supply that regulates at 1500 watts input power (regulate at 1500 watts input, OR 100 amps out, OR 75 volts output).
 
Hi there ron,

Well, 1500 watts input is 1500 amps output at 1 volt output, and 1500 watts input is 3000 amps output at 0.5 volt, so what is your plan? It would take quite a set of components to get 1500 amps out right? Maybe 15 welders in parallel? :)
 
Posts #7 and #9: limit the current to 100 to 130 amps.
What I am trying to say is that it does not make good use of time and power to put 20A at 0V (0 watts) into a cap, or 20A at 1V (20W), when the limiting items are the cap (130 amps) and/or the source (1500 watts).
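A rough comparison of ideal charge times for the same 1500-watt source with different current limits (losses ignored, so the numbers are only illustrative):

```python
def charge_time(i_max, p_max, c=100.0, v_full=75.0):
    """Seconds to charge an ideal capacitor from 0 V when the charger is
    current-limited to i_max until V reaches p_max/i_max, then power-limited."""
    v_knee = min(p_max / i_max, v_full)
    t_cc = c * v_knee / i_max                          # constant-current phase
    t_cp = 0.5 * c * (v_full**2 - v_knee**2) / p_max   # constant-power phase
    return t_cc + t_cp

# Same 1500 W source, different current limits: using the full 130 A the cap allows
# at low voltage saves real time compared with holding back to 20 or 40 A.
for i_lim in (20.0, 40.0, 130.0):
    print(f"{i_lim:4.0f} A limit -> {charge_time(i_lim, 1500.0):4.0f} s")
```

With the same 1500-watt budget, a 130 A limit finishes in roughly 190 s versus roughly 375 s for a 20 A limit, which is the time-and-power point being made above.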
 
Hi
I need to design a variable-output buck converter (0 to 24V) with a 24V input, using an ICL7667 as the MOSFET driver.
Is it necessary to have feedback?
 
Hi mshh,
You have hijacked someone else's ancient thread. Much better to start your own.
 