
Difference in behavior of Analog Voltmeter and Digital Multimeter

Status
Not open for further replies.

Kishore Mantha

New Member
While checking an industrial battery (Exide make) of 2.2 volts, an analog voltmeter of 0-3 V range showed no leakage voltage between the +ve pole and a lead ribbon placed over the rubber bag containing the electrolyte, but a digital multimeter showed a reading of 1.2 volts. This difference is observed in only one of a total of 110 cells; in the rest, both the digital and analog meters showed no difference in reading. Also, the resistance between the lead ribbon and the -ve pole is less than 0.5 MΩ for this cell, while for the rest of the cells it is more than 100 MΩ and they showed no leakage voltage. Since there is a drop in insulation there should be a leakage voltage, but the analog meter still showed no deflection except a slight hunting of the needle. Can anyone clarify this anomaly?
 

smanches

New Member
My guess is that the digital meter uses a Hall-effect IC for current measurements, which is much more sensitive at low currents than the coil winding in the analog meter. Do they both measure the same current on a known-good current source? The same voltage?
 

Reloadron

Well-Known Member
Most Helpful Member
What is the sensitivity of each meter? Here is what I am getting at:

Analog Voltmeter:

One of the design objectives of the instrument is to disturb the circuit as little as possible and so the instrument should draw a minimum of current to operate. This is achieved by using a sensitive ammeter or microammeter in series with a high resistance.

The sensitivity of such a meter can be expressed as "ohms per volt", the number of ohms resistance in the meter circuit divided by the full scale measured value. For example a meter with a sensitivity of 1000 ohms per volt would draw 1 milliampere at full scale voltage; if the full scale was 200 volts, the resistance at the instrument's terminals would be 200,000 ohms and at full scale the meter would draw 1 milliampere from the circuit under test. For multi-range instruments, the input resistance varies as the instrument is switched to different ranges.
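The "ohms per volt" arithmetic above can be sketched in a few lines. The 1000 Ω/V and 200 V figures are the same illustrative values used in the quote, not the specs of any particular meter:

```python
# Sketch of the "ohms per volt" sensitivity arithmetic described above.
# The 1000 ohms/volt meter on a 200 V range matches the quoted example.

def input_resistance(ohms_per_volt, full_scale_volts):
    """Input resistance of an analog voltmeter on a given range (ohms)."""
    return ohms_per_volt * full_scale_volts

def full_scale_current(ohms_per_volt):
    """Current drawn from the circuit at full-scale deflection (amps)."""
    return 1.0 / ohms_per_volt

r_in = input_resistance(1000, 200)   # 200,000 ohms on the 200 V range
i_fs = full_scale_current(1000)      # 1 mA drawn at full scale

print(f"Input resistance: {r_in:,.0f} ohms")
print(f"Full-scale current draw: {i_fs * 1000:.1f} mA")
```

Note how the input resistance changes with the selected range, which is why a multi-range analog meter loads the circuit differently on each setting.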
Digital Voltmeter:

Digital voltmeters (DVMs) are usually designed around a special type of analog-to-digital converter called an integrating converter. Voltmeter accuracy is affected by many factors, including temperature and supply voltage variations. To ensure that a digital voltmeter's reading is within the manufacturer's specified tolerances, they should be periodically calibrated against a voltage standard such as the Weston cell.

Digital voltmeters necessarily have input amplifiers, and, like vacuum tube voltmeters, generally have a constant input resistance of 10 megohms regardless of set measurement range.
If I had to guess, I would say your leakage current is not enough to drive the analog meter movement, while it can easily be measured by the digital voltmeter. That is why the analog meter shows nothing and the digital meter shows a reading. If your digital meter has a current range down in the µA region, you could try measuring the leakage current directly. Does that make sense?

Ron

The quotes I used were taken from here.
 

KeepItSimpleStupid

Well-Known Member
Most Helpful Member
It's all based on input impedance. The meter forms a voltage divider between its input Z and the source impedance. 10 MΩ is common for multimeters; for system DMMs and electrometers it is quite high, 100 GΩ or more. Some meters have an ohms/V rating, while others have a different Z on the low ranges.
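The divider described here is just V_read = V × Z_in / (Z_in + Z_source). A short sketch with assumed figures (a 1 V source behind 100 MΩ, read by a 10 MΩ DMM versus a 100 GΩ electrometer; the values are illustrative, not from this thread):

```python
# Voltage divider formed by the source impedance and the meter's input Z.
# Assumed figures: 1 V source behind 100 Mohm, two different meters.

def divider_reading(v_source, z_source, z_in):
    """Voltage indicated by a meter with input impedance z_in."""
    return v_source * z_in / (z_source + z_in)

v = 1.0
z_src = 100e6  # 100 Mohm source impedance

dmm = divider_reading(v, z_src, 10e6)       # 10 Mohm DMM: badly loaded
electro = divider_reading(v, z_src, 100e9)  # 100 Gohm electrometer

print(f"10 Mohm DMM: {dmm:.3f} V")
print(f"100 Gohm electrometer: {electro:.4f} V")
```

The ordinary DMM reads well under a tenth of the true voltage here, while the electrometer reads it almost exactly, which is why very high input Z matters for high-impedance sources.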
 

MikeMl

Well-Known Member
Most Helpful Member
The input impedance of a crappy analog meter is 5 kΩ/V; a good one is 20 kΩ/V; the best one I ever saw was 50 kΩ/V.

My Fluke DMM is 20 MΩ on all voltage ranges.
 
