I disagree. The DC input leakage spec on the A/D input pin is +/-1uA. The error caused by a 1uA change in input leakage current through the 200k/100k resistor divider is about 1% (about the same error as using 1% resistors). Besides, the leakage current will be more or less constant, so it mostly shows up as a fixed initial offset.
The requirement that the A/D input be fed from a voltage source with a source impedance of <10k is met by the 10uF capacitor. The settling time is long, but since the OP is measuring a battery, who cares.
Besides, I have done this, and it works just fine...
I agree with Mike that the source impedance doesn't need to comply with the less-than-10k requirement, thanks to the capacitor. The low source impedance is required so the internal sample-and-hold capacitor charges quickly; with the 10uF cap (I think 1uF is more than enough) the internal cap will charge almost instantly. However, I would suggest dropping the values to 50k and 22k and making sure you use a low-leakage capacitor.
Mike.
Suraj, 67k comes from 1/(1/200 + 1/100) = 66.7k, i.e. the two resistors in parallel, which is the Thevenin source resistance the ADC pin sees.
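To put numbers on the points above, here is a minimal C sketch (plain host-side arithmetic, nothing PIC-specific; the 14V battery voltage is my own assumption, while the +/-1uA leakage and the 10uF cap come from the thread) that works out the Thevenin resistance, the worst-case leakage error, and the RC settling time for both the original 200k/100k divider and the suggested 50k/22k values.

#include <stdio.h>

/* Worst-case DC input leakage from the thread: +/-1uA. */
#define I_LEAK 1e-6

/* Report leakage error and settling time for one divider.
   v_batt = 14.0 and the 10uF cap are assumptions for illustration. */
static void divider_report(const char *name, double r_top, double r_bot,
                           double cap, double v_batt)
{
    double r_th   = (r_top * r_bot) / (r_top + r_bot); /* Thevenin resistance    */
    double v_node = v_batt * r_bot / (r_top + r_bot);  /* ideal divider output   */
    double v_err  = I_LEAK * r_th;                     /* DC offset from leakage */
    double tau    = r_th * cap;                        /* RC time constant       */

    printf("%s: R_th=%.1fk  node=%.2fV  leakage error=%.1fmV (%.2f%%)  tau=%.2fs\n",
           name, r_th / 1e3, v_node, v_err * 1e3, 100.0 * v_err / v_node, tau);
}

int main(void)
{
    divider_report("200k/100k", 200e3, 100e3, 10e-6, 14.0);
    divider_report("50k/22k ",   50e3,  22e3, 10e-6, 14.0);
    return 0;
}

With the lower-value divider the worst-case leakage error drops from roughly 1.4% to under 0.4%, which is the point of suggesting 50k/22k; the trade-off is the extra standing current drawn from the battery.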
Hi all
I have a lead-acid battery charger. It's a simple charger.
What I want to do is measure its battery voltage from a PIC. The problem is that the output voltage is unsmoothed DC (there is no capacitor).
Will this be a problem when sampling? What precautions do I have to take?
Hello again,
Thinking that the capacitor makes up for a high source impedance because it has a low impedance of its own is a common misconception with ADC inputs. It does not really work because the input leakage current is a DC spec, while the capacitor can only lower the AC impedance; in short, one is a DC effect and the other is an AC effect.
For example, with a 100k resistor and a 1uF cap, the ADC leakage current shifts the reading by the 100k resistance times the max leakage current. With a 100k resistor and a 1000uF cap (a 1000 times bigger cap) the leakage current still shifts the reading in exactly the same way. On the other hand, if the resistor is dropped to 10k, the voltage generated by the leakage current is 10 times lower than it was with 100k. That *may* make a big difference when the device has to operate over the full temperature range; if you are running at room temperature all the time it may be fine, of course, especially if the system is calibrated, which it often is.
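A two-line calculation makes the point explicit: the DC offset is simply I_leak times R, and the capacitor value never appears in the expression. A minimal sketch, reusing the 1uA worst-case leakage quoted earlier in the thread:

#include <stdio.h>

int main(void)
{
    const double i_leak = 1e-6;            /* worst-case DC leakage, 1uA    */
    const double r[]    = { 100e3, 10e3 }; /* source resistances to compare */

    /* The DC offset is simply V = I_leak * R; note that the cap value
       never enters the expression, which is the point made above. */
    for (int i = 0; i < 2; i++)
        printf("R = %5.0fk -> leakage offset = %5.1fmV (for ANY cap size)\n",
               r[i] / 1e3, i_leak * r[i] * 1e3);
    return 0;
}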
Yes, it can work for a small temperature range, but over a wider temperature range the ADC reading may change and cause unacceptable inaccuracies. This is something to be aware of.
Thus, one of the "precautions" would be to make sure the ADC sees the right source impedance, or better put, the right source resistance. If we are going to do it, we might as well do it right; in the ADC world the accuracy target is usually plus or minus one half bit.
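To attach a number to that half-bit target: assuming a typical 10-bit PIC ADC with a 5V full scale (my assumption, not stated in the thread), the largest source resistance that keeps the worst-case leakage offset below half an LSB works out as follows.

#include <stdio.h>

int main(void)
{
    const double v_ref  = 5.0;   /* assumed ADC full scale (not from thread) */
    const int    bits   = 10;    /* assumed resolution, typical PIC ADC      */
    const double i_leak = 1e-6;  /* worst-case leakage quoted in the thread  */

    double lsb   = v_ref / (1 << bits);  /* one LSB in volts                 */
    double r_max = (lsb / 2.0) / i_leak; /* source R for a half-LSB offset   */

    printf("LSB = %.2fmV, half LSB = %.2fmV -> max source R = %.2fk\n",
           lsb * 1e3, lsb * 500.0, r_max / 1e3);
    return 0;
}

By that budget even 10k is marginal against the full +/-1uA spec, which is why a stable leakage current (and calibration) matters as much as the raw resistance.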