Hi Ron,
Oh yes, very good point.
On another related note, the 1 V resolution might not be good enough, but we don't want to have to move to a higher-resolution ADC. What to do?
Well, the standard measurement technique is linear, but if we move away from a purely linear measurement technique we can get higher resolution, provided the range of the input measurement is restricted. For example, the line itself can range from roughly 0 V to 140 V, a span of 140 V, but most of the time the line sits between 80 and 140 volts, a span of only 60 V. Using a subtractor circuit (or some other non-linear circuit) we can subtract 80 V from the 0-140 V line, which gives us 0 V to 60 V, representing the line being at 80 to 140 volts.
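To make the 80-140 V example concrete, here's a quick arithmetic sketch. The 10-bit ADC is my assumption, just to have a count to divide by; the point is that the same number of counts spread over 60 V instead of 140 V gives a finer step:

```python
# Sketch: per-count resolution with and without the 80 V subtraction.
# The 10-bit ADC here is hypothetical, purely for illustration.

ADC_BITS = 10
ADC_COUNTS = 2 ** ADC_BITS  # 1024 counts

# Plain linear measurement: the full 0-140 V span across the ADC.
linear_step = 140.0 / ADC_COUNTS            # volts per ADC count

# Offset technique: subtract 80 V first, so only 0-60 V spans the ADC.
offset_step = (140.0 - 80.0) / ADC_COUNTS   # volts per ADC count

print(f"linear: {linear_step * 1000:.1f} mV per count")
print(f"offset: {offset_step * 1000:.1f} mV per count")
```

With these assumed numbers, the step shrinks from about 137 mV to about 59 mV per count, more than doubling the usable resolution.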
A simpler example, using an 8-bit ADC, is this:
Say the line is 0 to 256 V. Using the standard linear technique, the 8-bit ADC gives us 256 counts across 256 volts, which is a resolution of 1 volt per count. But this line only varies from 128 V to 256 V. That cuts the range in half, so if we now subtract 128 V from the 0-256 V line we get 0 to 128 V, which represents 128 V to 256 V. Spread across the same 256 counts, that gives us a better resolution of 0.5 volts.
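Spelling that arithmetic out (the variable names are mine):

```python
# 8-bit ADC: 2**8 = 256 counts.
counts = 2 ** 8

# Linear: 0-256 V across 256 counts -> 1 V per count.
linear_res = 256.0 / counts            # 1.0 V

# Subtract 128 V first: 0-128 V across the same 256 counts -> 0.5 V per count.
offset_res = (256.0 - 128.0) / counts  # 0.5 V

print(linear_res, offset_res)
```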
To put it another way, with the purely linear technique we might measure 128 V or 129 V, but we can't measure 128.5 V. With the subtractor we can measure 128 V, but also 128.5 volts, and of course 129 V.
In a real circuit we might divide the 0-256 V down to 0-10 V and then subtract 5 V, or perhaps divide down to 0-5 V, subtract 2.5 V, then amplify by 2 to get back to 0-5 V, which represents 128 V to 256 V.
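That second chain (divide to 0-5 V, subtract 2.5 V, gain of 2) can be sketched as a simple mapping. This is just the ideal math of the signal chain, not a circuit design, and the function name is mine:

```python
def chain(line_v: float) -> float:
    """Map a 128-256 V line voltage to a 0-5 V ADC input.

    Divide 0-256 V down to 0-5 V, subtract 2.5 V (which corresponds
    to 128 V on the line), then amplify by 2 so the 128-256 V window
    fills the full 0-5 V ADC span.
    """
    divided = line_v * (5.0 / 256.0)   # divider: 0-256 V -> 0-5 V
    shifted = divided - 2.5            # subtract the 128 V equivalent
    return shifted * 2.0               # gain of 2 -> back to 0-5 V

# 128 V on the line lands at 0 V at the ADC, 256 V lands at 5 V,
# and the midpoint, 192 V, lands at 2.5 V.
print(chain(128.0), chain(192.0), chain(256.0))
```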
The disadvantage of this technique is that we can no longer measure voltages below 128v, but we did it knowing that we would never have to. If necessary though, we could call 128v an 'under-range' voltage and trigger an error or alarm.