Hello again
I have added a schematic of my design here.
There are several blocks, of which I have shown two.
One is the CC source as mentioned by Roff (although I have yet to decide on the op-amp to use in place of the LT1677), and the other is the differential instrumentation amplifier using the TI TLC4502 (the part I have on hand).
My intention: the load RL is a resistor with a temperature coefficient of 50 to 180 ppm/°C.
First I have to calibrate the circuit. For this I will pass a set current for a very short duration (10 ms). I will get the differential voltage by passing it through the diff. amp. The ADC samples will be averaged and stored. The stored value is then converted back to an analog value with a DAC (not shown in the schematic).
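To make the calibration step concrete, here is a minimal sketch in Python (not the actual firmware). It averages a burst of raw ADC counts taken during the 10 ms pulse and converts the mean to a DAC code for later use as Vref. The sample values, the 2.5 V reference, and the 12-bit resolutions are all made-up assumptions for illustration.

```python
# Calibration sketch: average ADC counts from the 10 ms current pulse,
# then convert the mean to a DAC code to be played back as Vref.
# ADC_BITS, DAC_BITS, and VREF are assumed values, not from the schematic.

ADC_BITS = 12
DAC_BITS = 12
VREF = 2.5  # assumed ADC/DAC reference voltage in volts

def average_counts(samples):
    """Mean of raw ADC counts taken during the calibration pulse."""
    return sum(samples) / len(samples)

def counts_to_volts(counts, bits=ADC_BITS, vref=VREF):
    """Convert a (possibly fractional) ADC count to volts."""
    return counts * vref / (2 ** bits - 1)

def volts_to_dac_code(volts, bits=DAC_BITS, vref=VREF):
    """Nearest DAC code that reproduces the given voltage."""
    return round(volts / vref * (2 ** bits - 1))

# Example: simulated noisy ADC counts from the 10 ms pulse
samples = [1636, 1639, 1638, 1640, 1637]
mean_counts = average_counts(samples)      # 1638.0
v_cal = counts_to_volts(mean_counts)       # 1.0 V with these assumptions
dac_code = volts_to_dac_code(v_cal)        # 1638
```

The same averaging could of course be done in hardware or in the ADC itself; the point is only that one stored number becomes the Vref for the test phase.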
During the actual test, the current will be the same as in calibration but will flow for a longer time (say 500 ms). The stored calibration value is applied as Vref (not shown in the schematic) through the DAC to another op-amp stage (also not shown). Because the current flows for longer, the temperature of Rload rises, its resistance changes, and so the voltage drop rises above the calibrated value.
This change in drop is in microvolts, which needs to be further amplified and measured with a 12-bit ADC over 2k samples.
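As a sanity check on those signal levels, here is a back-of-the-envelope calculation. All the numeric values (RL, test current, temperature rise, ADC reference) are assumptions I picked for illustration; only the formulas are general.

```python
# Rough signal-level check: how big is the microvolt change, and how many
# 12-bit ADC counts does it become after a gain of 1000?
# Every numeric constant below is an assumption, not from the schematic.

RL = 10.0          # assumed load resistance, ohms
I = 0.1            # assumed test current, amps
TEMPCO = 50e-6     # 50 ppm/degC, the low end of the stated 50-180 range
DELTA_T = 1.0      # assumed temperature rise over the 500 ms pulse, degC

delta_R = RL * TEMPCO * DELTA_T      # resistance change, ohms (0.5 mohm)
delta_V = I * delta_R                # change in drop across RL (50 uV)

GAIN = 1000.0                        # post-subtraction amplification
v_amplified = delta_V * GAIN         # 50 mV at the ADC input

ADC_VREF = 2.5                       # assumed ADC reference, volts
ADC_LSB = ADC_VREF / (2 ** 12 - 1)   # ~610 uV per 12-bit count
counts = v_amplified / ADC_LSB       # ~82 counts of usable signal
```

With these (assumed) numbers the 50 µV change lands at roughly 82 LSBs after the ×1000 stage, so averaging the 2k samples is doing real work against noise.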
I have used the application circuit from the TLC4502 datasheet (page 23, Fig. 37).
The diff. amp. in the schematic has to have a gain (A) of 1, since I only need the differential voltage across Rload (RL); the difference between the stored calibrated value and the time-modified value is then amplified further (as described above), perhaps by A = 1000.
If you check the formula for Vo, then for A = 1 I get nonsense values for the resistors (I only need Vo = V_RL).
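For reference, here is a quick numeric check assuming the Fig. 37 circuit is the standard four-resistor difference amplifier (I don't have the datasheet figure in front of me, so treat the topology as an assumption). With matched ratios R4/R3 = R2/R1 the output reduces to Vo = (R2/R1)(V2 − V1), so unity gain only requires R2 = R1, e.g. all four resistors equal; nothing pathological should come out of the formula.

```python
# Transfer function of the classic single-op-amp difference amplifier
# (assumed topology; resistor names R1..R4 are my own labels, not the
# datasheet's). Ideal op-amp assumed.

def diff_amp_vo(v1, v2, r1, r2, r3, r4):
    """Vo = V2 * (R4/(R3+R4)) * ((R1+R2)/R1) - V1 * (R2/R1).

    Reduces to (R2/R1) * (V2 - V1) when R4/R3 == R2/R1.
    """
    return v2 * (r4 / (r3 + r4)) * ((r1 + r2) / r1) - v1 * (r2 / r1)

# Unity gain example: all four resistors 10 k (values are illustrative)
vo = diff_amp_vo(v1=1.000, v2=1.050, r1=10e3, r2=10e3, r3=10e3, r4=10e3)
# vo equals V2 - V1 = 0.050 V
```

If the datasheet formula gives nonsense for A = 1, it may be written for a different reference point or assume a gain-setting resistor that cannot go to its limit value; posting the exact formula would help pin that down.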
What is wrong? Am I doing something wrong here?
Sorry for the long post.
Please get back to me.
Thanks and regards