Practical op amp design - voltage and current sense

Status
Not open for further replies.
Hello electronic experts! I need some help designing an accurate, easy-to-calibrate voltage and current sensor. It will be used for measuring the power output of a solar panel. The ADC I'm using measures 0 to 3.3 volts. The max output voltage of the solar panel is 14 volts, and I will use a high-precision resistor network (voltage divider) to condition the signal for the ADC.

Here is the problem: I want to measure the current of the panel but not drain too much power from the system by using a big resistor. The two approaches I'm working on are: 1) placing a very low-resistance instrument resistor of 0.005 ohms in series with the panel and then measuring the small voltage drop, amplifying the signal (0 to 3.4 mV) using an op amp to obtain the required 0-3.3 volt level; 2) using a circuit current clamp.

I have tested a non-inverting op amp with a gain of about 880 but am wondering how to design the circuit for maximum accuracy and stability. Do I use high or low resistance values (yes, 0.1% is a given) for the feedback loop? How do I design a good offset circuit to calibrate the op amp to produce 0 volts when 0 volts are input? What caps, if any, should I use? I'm basically looking for practical tips for amp design, not just theory. If anyone has actual tested circuits, that would be a great help. I'll take any hints or tricks you all have. Thanks for the help!

Frank
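A quick back-of-the-envelope check of the numbers in the post above (a sketch in Python; the 14 V, 3.3 V, 0.005 Ω, and 3.4 mV figures are taken directly from the post):

```python
# Sanity-check the signal-conditioning numbers from the post:
# 14 V panel max, 3.3 V ADC full scale, 0.005-ohm shunt, ~3.4 mV max drop.

V_PANEL_MAX = 14.0     # volts
V_ADC_MAX = 3.3        # volts
R_SHUNT = 0.005        # ohms
V_SHUNT_MAX = 3.4e-3   # volts, max shunt drop quoted in the post

# Voltage channel: divider ratio so the 14 V maximum maps to 3.3 V.
divider_ratio = V_ADC_MAX / V_PANEL_MAX
print(f"divider ratio needed: {divider_ratio:.4f}")   # ~0.236

# Current channel: the panel current implied by the quoted shunt drop,
# and the amplifier gain needed to use the full ADC range.
i_max = V_SHUNT_MAX / R_SHUNT
gain_needed = V_ADC_MAX / V_SHUNT_MAX
print(f"implied max current: {i_max:.2f} A")          # 0.68 A
print(f"gain for full scale: {gain_needed:.0f}")      # ~971 (vs. the ~880 tried)
```

Note the gain of ~880 tried in the post leaves a little ADC headroom (about 3.0 V at full current) rather than using the whole 3.3 V range.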
 
Justanotherproject said:
I need some help designing an accurate, easy-to-calibrate voltage and current sensor... I'm basically looking for practical tips for amp design, not just theory.

Might I suggest a few things:

For the current sensing: a low-value series resistor is a good way to go. You can build a low-power current-sense circuit quite easily, and the method you described sounds reasonable.

For the voltage sensing: do not use a resistor network directly on the 14 V, because it will load the panel down (steal current) even if the resistors used are quite large. Instead, feed the voltage to an op-amp, perhaps an instrumentation amp or diff-amp. If the op-amps used are FET-input types, you will not load the 14 V at all.

Your accuracy will be determined by the tolerances, offsets, and drifts of your op-amps and resistors. Some of these can be calibrated out, but some cannot (like long-term drift of resistors), or you will just have to re-calibrate periodically.
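To put a number on the resistor-tolerance part of that error budget, here is a sketch (Python; the 10.7 kΩ / 3.3 kΩ values are hypothetical, chosen only to give roughly the 3.3/14 ratio the thread needs):

```python
import itertools

# Worst-case divider-ratio error from resistor tolerance alone.
# Hypothetical values: R1 = 10.7k (top), R2 = 3.3k (bottom) gives
# 3300/14000 = 0.2357, i.e. 14 V in -> ~3.3 V out.
R1, R2 = 10_700.0, 3_300.0
nominal = R2 / (R1 + R2)

def ratio_error(tol):
    """Worst-case fractional ratio error over all four tolerance corners."""
    worst = 0.0
    for s1, s2 in itertools.product((-1, 1), repeat=2):
        r = R2 * (1 + s2 * tol) / (R1 * (1 + s1 * tol) + R2 * (1 + s2 * tol))
        worst = max(worst, abs(r - nominal) / nominal)
    return worst

for tol in (0.001, 0.01):   # 0.1% and 1% parts
    print(f"{tol:.1%} resistors -> up to {ratio_error(tol):.3%} ratio error")
```

With 0.1% parts the divider ratio is good to roughly 0.15% worst case, which is why the tolerances (and their temperature drift) dominate the error budget before the op-amp does.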
 
Thanks for the input. Are you saying that I should first buffer the panel voltage through an op amp and then use a resistor network to reduce the maximum voltage? Don't I have to put those resistors in after the op amp anyway? Is there a way to make an op amp with a gain of less than 1? Please explain this further if you have the time. Also, I calculated the power loss of using a 10 kΩ resistor network, and it comes out to only about 19 mW. But hey, I'm trying to keep as much power as I can generate! Thanks!
-Frank
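For reference, the divider-loss arithmetic above checks out (a quick Python check, using the 14 V and 10 kΩ figures from the thread):

```python
# Power drawn by a resistive divider across the panel: P = V^2 / R_total.
V = 14.0          # panel max, volts
R_TOTAL = 10_000  # total divider resistance, ohms

P = V**2 / R_TOTAL
I = V / R_TOTAL
print(f"{P*1000:.1f} mW dissipated")        # 19.6 mW, matching the ~19 mW above
print(f"{I*1000:.2f} mA drawn from panel")  # 1.40 mA
```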
 
solar panel measurements

If you use high-value resistors as a voltage divider, then an op amp as a voltage follower to monitor the voltage, it should work. If you want the monitor voltage to be 1/10 the voltage of the solar panels, use a 9-to-1 ratio of resistors in series.

Using a low-value resistor as a current sense will work. I would use an op amp set up as a differential amplifier to indicate the voltage drop across the sense resistor. I have tried various op amps to achieve this. I was trying to do it with a single-supply op amp and never had much success; as soon as I went to an op amp with + and - supplies, it worked great, and the differential amplifier even had lots of gain. I used one of the charge-pump ICs to get the negative supply voltage. Maxim or Linear Technology has an IC specifically for that purpose, with an internal sense resistor and an op amp; as I recall, the output is 1 mV per amp.

Unfortunately, in all my experiments I did not require the accuracy you are looking for.
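For the four-resistor difference amplifier mentioned above, the output (with matched resistor pairs) is Vout = (Rf/Rin)·(V+ − V−). A quick sketch of what that gives with the shunt numbers from earlier in the thread (Python; the 100 Ω / 88 kΩ values are hypothetical, chosen to hit the gain of 880 the OP tried):

```python
# Four-resistor difference amp: Vout = (Rf/Rin) * (Vplus - Vminus),
# assuming both resistor pairs are matched.
R_IN = 100.0       # ohms (hypothetical input resistors)
R_F = 88_000.0     # ohms (hypothetical feedback resistors) -> gain 880
gain = R_F / R_IN

R_SHUNT = 0.005    # ohms, from the original post
for i_amps in (0.10, 0.50, 0.68):
    v_shunt = i_amps * R_SHUNT
    print(f"{i_amps:.2f} A -> {v_shunt*1000:.2f} mV across shunt "
          f"-> {gain * v_shunt:.3f} V out")
```

One caveat on those hypothetical values: 100 Ω input resistors add to the 0.005 Ω shunt path seen by the amp, so in practice the input resistors are kept much larger than the source impedance but small enough not to add noise, and the gain is set by the ratio.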
 
Justanotherproject said:
Are you saying that I should first buffer the panel voltage through an op amp and then use a resistor network to reduce the maximum voltage? ... Is there a way to make an op amp with a gain of less than 1?

Well, I am saying that you can use the resistor idea as you originally described for measuring the voltage, if you consider a few things. Sure, it will work, BUT:

If you want to draw as little as possible from your solar panel, then you must make these resistors very large. Very large resistors are noisier, and unless you get high-quality devices ($$$) they will vary quite a bit with temperature changes (even small ones), and this by itself will add error to your voltage measurement. You mentioned accuracy, but you have not quantified it. So I am suggesting ways to get the best accuracy you can while spending very little $$$ on components.
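To see how much noisier large divider resistors get, the thermal (Johnson) noise of a resistor is v_rms = √(4·k·T·R·B). A quick comparison (Python; the 1 kHz measurement bandwidth is an assumption for illustration):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # kelvin, roughly room temperature
BANDWIDTH = 1_000.0  # Hz, an assumed measurement bandwidth

def thermal_noise_vrms(r_ohms):
    """RMS thermal noise voltage of a resistor over BANDWIDTH."""
    return math.sqrt(4 * K_B * T * r_ohms * BANDWIDTH)

for r in (10e3, 1e6, 100e6):
    print(f"{r:>13,.0f} ohm -> {thermal_noise_vrms(r)*1e6:.2f} uV rms")
```

The noise only grows as √R, so it is usually the temperature coefficient and leakage-current sensitivity of very large resistors, rather than the noise alone, that hurt accuracy first.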

So you could make a diff-amp with two FET-input op-amps that will behave as if you had giga-ohm resistors there. You can get a low-cost dual JFET-input op-amp for 36 cents, and the op-amps will not vary as much with time and temperature as the resistors would.


What are your power supplies available? Were you planning to use the solar output to run your circuits as well? If not, what do you have available for supplies?
 
Ah, I see why you would use the op amps before the resistor network (extremely high input resistance with low temperature drift and lower noise). I'll be trying a few circuits this week. I may be back for more help. Thanks again,

Frank
 
Hi Frank, thanks for posting this question! I'm wrestling with a similar challenge myself.

Optikon wrote:
"What are your power supplies available? Were you planning to use the solar output to run your circuits as well? If not, what do you have available for supplies?"

Suppose you have a +3.3V supply voltage (independent from the solar panel) that you want to use for your microprocessor as well as for signal conditioning circuitry between panel and microprocessor ADC. Further suppose you want to be able to turn off the signal conditioning circuitry to save power, without damaging any parts.

What would you recommend? Any advice would be thoroughly appreciated.

Thanks,

-K
 
You will not find an op-amp that will output 3.3 V from a 3.3 V supply; you'll only get close. Likewise, if you only have a single supply, you will only get close to 0 V output. 5 V positive regulators are cheap and easy to use; use an ICL7660 or similar to obtain the negative rail. Instead of using 0.1% resistors, use 1% metal films with a 20-turn cermet trimpot in strategic locations.
 
This thread is 2004 vintage; I guess the OP has solved his problem.

An MCP6001/2 is a rail-to-rail op amp that will operate from +1.8 V to +5.5 V. Although rail-to-rail only means the output swings to within a few millivolts of +V and 0 V, this is usually good enough for most projects.
 
Thanks a lot for the response!

I'm actually not so much worried about finding an op amp that meshes well with the microcontroller's ADC as I am puzzled about how to implement a signal-conditioning circuit for voltage measurement in a way that either: 1) draws <5 µA in use, or 2) can be turned off by the microprocessor, in order to draw <5 µA when no measurement is taking place. If you have any low-power recommendations related to reading a high-voltage signal with a low-voltage ADC, I'm all ears! (Thanks for the Hall-effect sensor idea for measuring current using low-power circuitry; I'll look into it.)
 
A 15 V voltage divider that draws only 5 µA would require ginormous resistance. Either use a small relay (mercury-wetted reed preferred) or a low-ohm solid-state switch, and sample the voltage with a divider that is not overly affected by the op-amp's input resistance. Whatever op-amp you choose, it WILL affect the voltage divider; it's just a question of how much. 0.1% accuracy can be a real challenge, especially when you don't have a test instrument that can verify your results.
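Putting numbers on that: a 5 µA budget at 15 V forces the divider to total 3 MΩ (a quick check in Python; the top/bottom split is just the 3.3/15 ratio from earlier in the thread):

```python
# Total divider resistance for a 5 uA budget at 15 V, and the split
# needed to map 15 V down to a 3.3 V ADC full scale.
V_IN = 15.0       # volts
I_BUDGET = 5e-6   # amps
V_OUT = 3.3       # volts, ADC full scale

r_total = V_IN / I_BUDGET
r_bottom = r_total * (V_OUT / V_IN)
r_top = r_total - r_bottom
print(f"total: {r_total/1e6:.1f} Mohm "
      f"(top {r_top/1e6:.2f} M, bottom {r_bottom/1e6:.2f} M)")
# 3.0 Mohm total: 2.34 M on top, 0.66 M on the bottom -- large enough
# that op-amp input bias current and board leakage start to matter,
# which is why switching the divider in only during a measurement helps.
```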
 