I know that current can be measured with an opamp (configured as a current-to-voltage converter) so that the output voltage is proportional to the input current. It is easy to find theoretical information about this, but I wonder how it works in a practical situation.
Here is a theoretical drawing from Wikipedia:
**broken link removed**
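Since that link is dead, here is the relation I believe the drawing showed (the standard inverting transimpedance stage - this is my reading of the circuit, not the figure itself): the measured current flows into the inverting input and returns through a feedback resistor $R_f$, so

$$V_{out} = -I_{in} \cdot R_f$$

which is why the output voltage is proportional to the input current.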
The idea is that I want to make a digital precision ammeter that will measure current in the 0-100 mA range. I suppose that means a suitable current-to-voltage converter and an ADC. I have worked with the latter before and have no questions about ADCs or DACs - right now it is all about the current-to-voltage converter.
Please don't recommend using precision shunt resistors instead - I don't know where to get them and they are probably very expensive. If you insist on shunt resistors, please read my questions anyway and tell me why.
So my questions are:
1. How do I connect the opamp to measure current? I suppose it has to be connected in series like an "analog" ammeter, but does the measured current go between the inverting input and the output, or between the inverting input and ground?
2. What parameter in opamp datasheets states the maximum allowed input current?
3. Do I need power opamps, or can I divide the input current across several opamps and sum the output voltages?
4. What about the voltage in the circuit being measured? Example: measuring the current through a BJT with VCC = 15 V, where the ammeter/current-to-voltage converter is connected between the supply (VCC) and the collector of the transistor. Does VCC have to be within the limits of the opamp's supply?
5. Can I calibrate the current-to-voltage converter with a constant current source or sink, the way an IC precision voltage reference can be used to calibrate a voltmeter? If yes - how? Are there any cheap or reasonably priced IC precision current sources/sinks available? I'm asking because I want to be sure that 100 mA => 10 V output, 10 mA => 1 V output, or something like that (see the worked numbers after this list).
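To make question 5 concrete, and assuming the transimpedance relation above holds, the scaling I want comes down to a single resistor value (taking magnitudes, since the inverting stage flips the sign):

$$R_f = \frac{|V_{out}|}{I_{in}} = \frac{10\ \text{V}}{100\ \text{mA}} = 100\ \Omega$$

Since the converter is linear, 10 mA would then give 1 V without any extra adjustment, so calibration would only need to verify this one gain.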
Those questions have been on my mind for quite some time. I really hope that I can get some answers on this forum - any help/advice is appreciated.