I have a simple temperature-measuring system using an LM35 temperature sensor and a PIC16F690 microcontroller interfaced to a 3-digit 7-segment LED display.
I currently have the sensor connected to an analog pin (via an op amp) and use the ADC to sample it. I was wondering how I could implement a simple offset adjustment between the sensor and the PIC?
I know that one could interface a trim pot on another analog channel and use software correction, but I am wondering if there is a simple hardware solution?
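(For anyone curious what the software-correction route would look like, here is a rough sketch in plain C. The register-level ADC reads are omitted, and `ADC_MAX` / `OFFSET_SPAN` are illustrative values, not from any actual firmware.)

```c
#include <stdint.h>

/* Hypothetical sketch of the software-correction route: a trim pot on a
 * second ADC channel is mapped to a signed offset in ADC counts, which is
 * then added to the raw temperature reading. The register-level ADC reads
 * are omitted; ADC_MAX and OFFSET_SPAN are illustrative values. */
#define ADC_MAX     1023   /* 10-bit ADC full scale */
#define OFFSET_SPAN 40     /* pot maps to roughly -20..+20 counts */

/* Mid-scale pot position means zero offset. */
int16_t pot_to_offset(uint16_t pot_raw)
{
    return (int16_t)((int32_t)pot_raw * OFFSET_SPAN / ADC_MAX) - OFFSET_SPAN / 2;
}

int16_t corrected_reading(uint16_t temp_raw, uint16_t pot_raw)
{
    return (int16_t)temp_raw + pot_to_offset(pot_raw);
}
```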
Sorry, I do not have my schematics at the moment, but simply put: a 5 V supply powers the LM35 sensor, its output goes into a non-inverting (rail-to-rail) op amp configured with a gain of 8, and that output is then fed into the uC.
I know I can adjust the gain at the op amp, but I want to be able to add a linear offset to correct the temperature reading.
Hi, apologies for the delay. This is the schematic of my temperature sensor. I did not know where to find the LM35 in LTspice, so I put in a signal generator instead.
Also note that R2 in my circuit is made up of a 6.8 kΩ fixed resistor plus a 1 kΩ variable resistor.
Here's the circuit with an offset pot to add a negative offset voltage to the output. The pot is buffered by another op amp so the pot resistance does not affect the gain setting of the output op amp circuit.
Edit: Depending upon the offset required you may want to add a resistor in series between the +5V and the pot to reduce the pot adjustment sensitivity and improve the pot resolution.
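For anyone following along, the arithmetic behind this trick can be sketched out as below (illustrative values, not the actual schematic: rf = 7 k and rg = 1 k give the gain of 8 mentioned earlier).

```c
/* Rough arithmetic sketch (not the actual schematic): for a non-inverting
 * stage with feedback resistor rf and gain resistor rg, driving the "ground"
 * end of rg with a buffered pot voltage v_off gives
 *     v_out = (1 + rf/rg) * v_in - (rf/rg) * v_off
 * so the offset voltage is scaled by rf/rg, not added one-for-one. */
double amp_out(double v_in, double v_off, double rf, double rg)
{
    return (1.0 + rf / rg) * v_in - (rf / rg) * v_off;
}
```

With rf/rg = 7, a 10 mV pot voltage shifts the output by about 70 mV, which is why a series resistor helps tame the pot's sensitivity.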
Thanks, this solution is perfect as the op amp IC I am using is a dual op amp. I will implement it later and provide feedback.
@crutschow, I need to offset small errors, so maybe ±0.5 V.
The 5 V line is simply regulated with a 7805 and a filter capacitor; the load is very small, so I assume the voltage will remain pretty much constant. I will also join the LTspice group as suggested.
Hi,
I implemented the above solution, and I now see how it works by shifting the voltage level at the 1 k resistor. However, it does not work for me because:
1 - The rail-to-rail op amp's output actually only swings between rail+0.1 V and rail-0.1 V, so I am left with a constant offset.
2 - The LM35 signal is only used over the range 0-60 degrees, so my input signal is 0-600 mV.
3 - Against a 600 mV signal, that 100 mV limit of the rail-to-rail op amp is quite significant.
4 - Lastly, I need to be able to add and subtract around ±20 mV (2 degrees) to the input signal for sufficient offset adjustment.
Any further suggestions are welcome and I will certainly consider a completely new design.
1 - That is not an offset voltage; it is just the minimum and maximum output range of the op amp.
3 - Again, that 100 mV (or 10 mV, as the case may be) is not an offset.
4 - So you are trying to adjust the offset of the output to remove the offset error of the LM35? Why not just remove that in software? Store the offset value in memory and subtract it from all the readings.
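A rough sketch of what that software fix could look like in C (illustrative only: on a real PIC the offset would live in EEPROM, and the constants below assume the 10-bit ADC, 5 V reference and gain-of-8 front end from this thread):

```c
#include <stdint.h>

/* Minimal sketch of the software fix suggested above: store a signed
 * calibration offset (e.g. in EEPROM on a real PIC) and subtract it from
 * every reading. Constants assume the 10-bit ADC, 5 V reference and
 * gain-of-8 front end discussed in this thread. */
#define VREF_MV   5000L
#define ADC_STEPS 1024L
#define GAIN      8L

/* Convert a desired offset in sensor millivolts (LM35: 10 mV per degree)
 * into ADC counts as seen after the gain stage. */
int16_t offset_mv_to_counts(int16_t offset_mv)
{
    return (int16_t)((int32_t)offset_mv * GAIN * ADC_STEPS / VREF_MV);
}

/* Subtract the stored offset and clamp to the ADC range. */
uint16_t corrected(uint16_t raw, int16_t offset_counts)
{
    int32_t v = (int32_t)raw - offset_counts;
    if (v < 0) v = 0;
    if (v > ADC_STEPS - 1) v = ADC_STEPS - 1;
    return (uint16_t)v;
}
```

Note that the requested ±20 mV at the sensor becomes ±160 mV after the gain of 8, i.e. about ±32 ADC counts, so the resolution in software is more than adequate.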