Programmable Current Source Using LTC3623

For minimum current output, all the current of the 50 µA source in the IC needs to pass through the FET. Given that the rated Vgs(thr) for the specified FET is ~1V, that means the opamp needs to drive the gate with at least 1.5V in theory. If for some reason either the opamp or the FET were out of spec, that could account for the effect you are seeing. Can you try supplying the FET from 6V instead of 3.3V?
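To make that arithmetic explicit, here's a minimal Python sketch of the headroom argument. The 0.5V source voltage and the zero overdrive margin are assumptions of mine chosen to reproduce the 1.5V figure; they are not read off the schematic:

```python
# Back-of-the-envelope check of the gate-drive headroom argument above.
# The source voltage and overdrive margin are ASSUMED, not from the schematic.
vgs_th = 1.0      # rated Vgs(thr) of the specified FET, volts (~1V per the post)
v_source = 0.5    # assumed voltage at the FET source, volts
overdrive = 0.0   # extra gate overdrive margin to conduct the full 50 uA, volts

v_gate_needed = v_source + vgs_th + overdrive
print(f"op amp must drive the gate to at least ~{v_gate_needed:.1f} V")

for rail in (3.3, 6.0):
    margin = rail - v_gate_needed
    print(f"op amp rail {rail} V -> swing margin {margin:.1f} V")
```

On paper a 3.3V rail leaves ~1.8V of margin, but an out-of-spec threshold or limited op amp swing eats into that; a 6V rail removes the doubt.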

I already tried to do something similar to this (I think). Let me explain:

The MCP4726 DAC is using a reference voltage of 0.5V from the external ADR130. However, the reference voltage is software selectable. So in my troubleshooting I experimented with using the DAC supply voltage of 3.3V as the reference voltage. This had no effect on the behavior I described above. When the output of the DAC reached around 400 mV, I bottomed out at 0.9A. Going from 400 mV all the way up to a DAC output of 3.3V did nothing to decrease that 0.9A any further.
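For reference, here's the ideal transfer function behind that experiment, using the MCP4726's 12-bit resolution. The gain setting of 1 is an assumption on my part (the part also offers a 2x option):

```python
# Ideal MCP4726 transfer function: Vout = (code / 4096) * Vref * gain.
# Gain of 1 is assumed here.
def dac_out(code: int, vref: float) -> float:
    return (code / 4096) * vref

for vref in (0.5, 3.3):            # external ADR130 vs. the 3.3V supply
    code_400mV = round(0.4 / vref * 4096)
    print(f"Vref = {vref} V: 400 mV is code {code_400mV}, "
          f"full scale = {dac_out(4095, vref):.3f} V")
```

Either reference can produce the ~400 mV where the floor appears, which is consistent with the reference swap changing nothing.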

I'll admit though that when I tried this, I did not measure the op amp output connected to the gate of the MOSFET. But does what I describe here accomplish the intent of what you're recommending? Or do you think upping the supply voltage of the op amp is still worth a try? I assume you meant the op amp, not the FET, when you wrote:

Can you try supplying the FET from 6V instead of 3.3V?

Thanks for your help.

EDIT #1: I just got home and decided to test the voltage output of the op amp connected to the gate of the MOSFET. When the output of the DAC is ~400 mV, the current output is ~0.9A, and the op amp output is around 1.2V. However, if I increase the DAC output ever so slightly, say to 415 mV, the op amp output jumps to 3.3V, and the output current stays at 0.9A.

Does this help anyone narrow down my issue?

EDIT #2: Does anyone think lowering the value of the 10 kΩ resistor, R20, would help? I'm kind of grasping at straws for a solution.
 
One thing I think I neglected to mention, and am now learning (after playing around in LTspice) may be relevant, is that the load resistance for this circuit is very low. Like close to short circuit low.

Right now I'm running all my tests with a load resistance of about 50 mΩ.

After reading Alec's post about all 50 μA having to go through the MOSFET to make the output 0A, I think what may be happening is this: with the load resistance so low, the IC is trying to compensate by lowering the output voltage. It's pulling the output so close to ground that a significant portion of that 50 μA flows across R18 to the output instead of through the MOSFET. Does this sound plausible to anyone?
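To put rough numbers on that hypothesis: if the node the 50 μA source feeds sits at some voltage while the output is pulled near ground, the share escaping through R18 is just Ohm's law. The 0.5V node voltage below is a guess for illustration, not a measurement:

```python
# How much of the internal 50 uA could be diverted through R18 when the
# output sits near ground. The node voltage is an ASSUMED value.
i_source = 50e-6   # internal current source, amps
v_node = 0.5       # assumed voltage at the node R18 and the MOSFET share, volts
v_out = 0.0        # output pulled close to ground, per the hypothesis above

for r18 in (10e3, 22e3, 220e3):       # stock value plus the two values tried below
    i_r18 = (v_node - v_out) / r18    # current escaping through R18
    i_fet = max(i_source - i_r18, 0)  # what's left for the MOSFET
    print(f"R18 = {r18/1e3:>3.0f}k: {i_r18*1e6:5.1f} uA via R18, "
          f"{i_fet*1e6:4.1f} uA via the MOSFET")
```

Under those guessed numbers a 10 kΩ R18 would divert essentially all of the 50 μA, while 220 kΩ diverts almost none, which at least points in the same direction as the experiments described next.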

If this is in fact what's going on, logically you'd think that I could increase the value of the R18 resistor to allow more current to flow through the MOSFET. So I tried that.

First, I swapped R18 for a 220 kΩ resistor. That resulted in over 5A coming out of the circuit when the input to the op amp was 400 mV and the circuit seemed unstable.

Next, I tried a 22 kΩ resistor. That resulted in about 0.9A coming out of the circuit when the input to the op amp was 500 mV. So, the 22 kΩ resistor seemed to improve things in the sense that I was no longer bottoming out at 400 mV. But I'm still not getting below that magical 0.9A output.

I'm going to keep experimenting with various values and combinations of R18 and R20.

Am I on the right track? Anyone have any ideas?
 
I did mean opamp, not FET. Sorry for the confusion.
I think I've solved the mystery re the 0.9A minimum and the 280 mV figures. I see from the datasheet that the minimum 'on' time of the LTC3623 is, by design, 30 ns. That means the minimum duty cycle, with a 1 µs period, is ~3% rather than the 0% you would need to get zero load current. The following simplified sim of the output of your circuit bears out both figures.
[Attached simulation screenshot: ControlledCurrent.JPG]
Reducing the PWM frequency should reduce the output current minimum.
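For anyone checking along, the arithmetic is just:

```python
# Minimum duty cycle = minimum 'on' time / switching period.
t_on_min = 30e-9   # LTC3623 minimum on-time per the datasheet, seconds
f_sw = 1e6         # switching frequency in this circuit, Hz

d_min = t_on_min * f_sw                             # 30 ns / 1 us
print(f"minimum duty cycle at 1 MHz: {d_min:.1%}")  # ~3.0%
# Zero load current would need 0% duty, which the part can't reach by design.
```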
 
I did mean opamp, not FET. Sorry for the confusion.
I think I've solved the mystery re the 0.9A minimum and the 280 mV figures. [...]
Reducing the PWM frequency should reduce the output current minimum.

That makes sense. Although, the behavior I observed was that the current bottoms out at 0.9A when the input is around 400 mV, not 280 mV. But that could probably be explained away by minor differences in component values and real-world effects not captured by the sim. I hope so anyway.

So, if I go the route of decreasing the PWM frequency, I imagine that would adversely affect the circuit's transient response. Do you concur? I'm not sure that's a bad thing for my application but I'm just trying to think through everything. Are there any other negative effects you can foresee?

I couldn't go too much lower on the PWM frequency. In my circuit it's fixed at 1 MHz, and the minimum per the datasheet is 400 kHz. If I went all the way down to 400 kHz, that would drop the duty cycle to ~1.2%. That probably still won't get me down to the 90 mA my application requires.
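Here's that scaling written out, assuming the current floor tracks the minimum duty cycle proportionally (that proportionality is my assumption, anchored to the 0.9A observed at 1 MHz, not something from the datasheet):

```python
# Estimate of the current floor at the datasheet-minimum switching frequency.
# ASSUMPTION: the floor scales linearly with minimum duty cycle.
t_on_min = 30e-9            # seconds
i_floor_1mhz = 0.9          # observed current floor at 1 MHz, amps
d_min_1mhz = t_on_min * 1e6

for f_sw in (1e6, 400e3):
    d_min = t_on_min * f_sw
    i_est = i_floor_1mhz * d_min / d_min_1mhz
    print(f"{f_sw/1e3:4.0f} kHz: duty floor {d_min:.1%} -> "
          f"est. current floor ~{i_est*1e3:.0f} mA")
```

Even at 400 kHz the estimate comes out around 360 mA, nowhere near 90 mA.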

Another workaround I'm thinking of would be to place a MOSFET and a 1 Ω resistor in parallel at the output, between my circuit and the load. I would choose a MOSFET with a low RDS(on) and have it conducting most of the time, effectively shorting out the 1 Ω resistor. When the current requirement drops below 1A or so, I'll remove the command from the MOSFET and force the output of my circuit to pass through the 1 Ω resistor, effectively increasing the load resistance by 1 Ω.
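As a crude feasibility check, here's a sketch that models the converter at its floor as a fixed minimum average output voltage, implied by 0.9A into the 50 mΩ load. That model is my assumption, not something from a sim:

```python
# Effect of the switchable 1 ohm series resistor on the current floor.
# ASSUMPTION: at minimum duty the converter produces a roughly fixed average
# output voltage, so the floor current scales inversely with total resistance.
r_load = 0.05                  # load, ohms (~50 mOhm per the earlier post)
v_floor = 0.9 * r_load         # implied minimum average output, ~45 mV

for r_series in (0.0, 1.0):    # bypass MOSFET on (shorted) vs. off (1 ohm in)
    i_floor = v_floor / (r_load + r_series)
    print(f"series R = {r_series} ohm -> current floor ~{i_floor*1e3:.0f} mA")
```

Under that model the floor drops to roughly 43 mA with the 1 Ω in circuit, which would clear the 90 mA requirement, though only a sim or bench test would confirm how it behaves during the switchover.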

I think it'll work for my application because the current requirement is something that will change but it'll be very gradual and I shouldn't get into any cases in which I'm oscillating between having the MOSFET on and off. Do you see any red flags with that idea?

If I didn't do a good job of explaining it, let me know and I'll post a schematic.

Thanks for your help!

EDIT: Also, I forgot to ask. What are your thoughts on decreasing the size of the output capacitor?
 
I imagine that would adversely affect the circuit's transient response.
Seems likely.
The FET/1 Ω should work. Don't know what the effect of changing the output cap on stability might be.

Edit: Sim shows that switching the FET/1 Ω on/off gives a significant transient spike in the output current.
 