Current Supply Design

Status
Not open for further replies.

jacobs_ladder

New Member
Project: Takes two input signals, one 0 to 5 V representing the desired current, the other 0 to 5 V representing the maximum current. The output current is linear with respect to the input signal.

I've tried using an op-amp voltage-to-current amplifier, but I cannot get it to work properly.

I plan on using a comparator and selector to choose the correct signal to drive the current source, but the current source itself is what is giving me problems now. Any tips out there? Please feel free to help.
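The comparator-and-selector stage described above amounts to driving the current source with the lower of the two control voltages. A minimal sketch of just that selection logic (function name and values are hypothetical, not from the actual circuit):

```python
def drive_voltage(v_desired: float, v_max: float) -> float:
    """Select the control voltage actually sent to the current source:
    the desired-current signal, clamped by the max-current signal."""
    return min(v_desired, v_max)

# A 3.0 V request limited by a 2.5 V ceiling is clamped to 2.5 V.
print(drive_voltage(3.0, 2.5))  # 2.5
# A request below the ceiling passes through unchanged.
print(drive_voltage(1.0, 2.5))  # 1.0
```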

P.S. Driving a Diode Laser.
 
You need to provide more info.
1. Do you need a current source or a current sink?
2. What is the maximum current?
3. Is the load connected to ground or a DC voltage?
4. What is the highest voltage the output must sustain?
 
Max load voltage drop is 5 V; max current is 0.2 A.

The input range is 0 to 5 V. The op-amp circuit is simple, and up to 0.6 V it is perfect: 1 V = 40 mA.

This is a current source, and the load is connected through a resistor to ground. The resistor sets the amps per volt of the driver amplifier.

EDIT: added schematic
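From the numbers in this post (1 V in gives 40 mA out, with the sense resistor to ground), the sense resistor value and its dissipation work out as follows. This is just arithmetic on the stated specs, not a claim about the attached schematic:

```python
# Transconductance stated above: 40 mA of load current per volt of control input.
gain_a_per_v = 0.040

# With the sense resistor to ground and the op amp forcing the control
# voltage across it, the resistor value is the reciprocal of the gain.
r_sense = 1 / gain_a_per_v           # 25 ohms

i_max = 0.2                          # stated max current, A
v_sense_max = i_max * r_sense        # voltage across the sense resistor at full current
p_sense_max = i_max**2 * r_sense     # power dissipated in the sense resistor

print(r_sense, v_sense_max, p_sense_max)  # 25.0 5.0 1.0
```

Note that at full current the sense resistor drops 5 V on top of the 5 V load drop, so the supply needs at least 10 V of headroom plus whatever the pass device requires.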
 

Attachments

  • i_to_v.JPG
  • schematic_152.jpg
It appears to me you need to add a transistor to the output. An op amp is a voltage-gain device, while a transistor is a current-gain device. Here's an example circuit.


**broken link removed**
 
Mr. Ladder, I'm not sure what your circuit looks like now, but the one you posted is a voltage source, not a current source. Try running with a fixed voltage for V3 and changing the load: you'll find that the current is not constant, but instead equals V3/R2.
Also, you have the + and - pins on the op amp reversed. This usually works in simulations (I'm not sure why), but it won't work in hardware.
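The difference shows up numerically if you hold V3 fixed and sweep the load: a voltage source lets the current follow V3 divided by the load, whereas a true current source would hold it constant. A quick sketch (V3 value and load resistances are hypothetical):

```python
v3 = 2.0  # fixed control voltage, V

# The posted circuit acts as a voltage source, so the load current
# tracks the load resistance instead of staying constant.
for r_load in (10.0, 25.0, 50.0):
    i_load = v3 / r_load
    print(f"R = {r_load:5.1f} ohm -> I = {i_load * 1000:.0f} mA")
# R =  10.0 ohm -> I = 200 mA
# R =  25.0 ohm -> I = 80 mA
# R =  50.0 ohm -> I = 40 mA
```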
 