Specifications Applied


Caltech

New Member
I have a hand-held 4-1/2 digit LCD, digital true-RMS voltmeter. The specifications state, for example: accuracy on the 200mV range is +/-(0.05% rdg + 3 dgt). I believe the max count is 19999 (or 9999, either way).

Question 1 - does "rdg" refer to the indicated value, or to some standard reference reading?

Question 2 - does "+3 dgt" mean 3 × (1/19999) of full scale, or 3 × the least significant digit (LSD) of the indicated value in the example above?

Thanks, Mike
 
DMM Specifications

Mike, here's a generalized write-up I did on meter accuracy. I'll add on some specific examples for your situation at the end.


METER SPECIFICATIONS

A digital multimeter (DMM) has two separate parts to its accuracy specification: ±percent of reading and ±digits. A typical specification may be ±0.1% of reading, ±1 digit. For a 3-1/2 digit meter, this is an excellent specification: 0.1% works out to (0.001)(100), or 0.1v, when measuring 100v on the 200v range, where the typical reading would be "100.0". With a specification of ±0.1%, the actual voltage could be anywhere from 99.9 to 100.1 volts and still be within tolerance. If we were measuring 10v on that same 200v range (a reading of "10.0"), there would be little difference in accuracy: 0.1% of 10v is 0.01v, meaning that the error would hardly affect the display at all.
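If you'd rather see that as arithmetic, here's a quick Python sketch of just the ±% of reading part (the function is my own throwaway, purely for illustration):

# Error band from the ±% of reading part of a DMM spec alone.
def reading_error_band(reading_v, pct_of_reading):
    err_v = reading_v * pct_of_reading / 100.0
    return reading_v - err_v, reading_v + err_v

lo, hi = reading_error_band(100.0, 0.1)   # 100.0v on the 200v range
print(f"100.0v at +/-0.1%: {lo:.1f} to {hi:.1f} v")   # 99.9 to 100.1
lo, hi = reading_error_band(10.0, 0.1)    # 10.0v on the same range
print(f"10.0v at +/-0.1%: {lo:.2f} to {hi:.2f} v")    # 9.99 to 10.01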

Compare this to an analog meter (VOM) such as the venerable Simpson 260. A modern Series 8 Simpson 260 is specified as ±2% of full scale on any range. On the 100v range, that means ±2% of 100v, or ±2v, regardless of the voltage being read. If you're reading 100v (full scale) on the 100v range, the effective accuracy is the original 2%. However, if you measure 10v on the 100v range, that ±2v ends up being ±20% of the 10v reading! That's why you're always told to shoot for an upper-scale reading on an analog VOM, going for the lowest range you can without going off the scale, in order to get the best accuracy from the meter.
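The same sort of sketch shows why the VOM's relative error balloons at the bottom of the scale (again, just illustrative Python):

# ±% of full scale is a fixed voltage error, so the relative
# error grows as the reading drops down the scale.
def fullscale_error_pct(reading_v, fullscale_v, pct_of_fs=2.0):
    err_v = fullscale_v * pct_of_fs / 100.0
    return 100.0 * err_v / reading_v

print(fullscale_error_pct(100.0, 100.0))  # 2.0  -> ±2% at full scale
print(fullscale_error_pct(10.0, 100.0))   # 20.0 -> ±20% at 10v!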

Back to the DMM, it's that "±digit" specification that's the killer if you don't watch what you're doing. You have to add this second specification on to the %reading specification to see the full effect of the meter's accuracy. For instance, that same 100.0 reading on the 200v range that gave us ±0.1v has to be modified to include ±1 digit. Since the last digit on our reading represents tenths of a volt, that means that the reading can bobble ANOTHER 0.1v up or down from that point for a total of ±0.2v. This gives an overall percentage accuracy for that reading of ±0.2%, still not bad at all for our meter.

It's when you try to read a lower voltage on that same 200v range that you can get into trouble. Yes, the ±% of reading part was only ±0.01v for our 10v example, but we have to add that ±1 digit (±0.1v on this range) onto the reading. This translates into ±0.11v total, and for a 10v reading, that means our overall accuracy just slipped to a lousy ±1.1%! Use the 20v range for this same 10v measurement (a display of "10.00", where one digit is only 0.01v), and your accuracy pops back up to ±0.2%. So the same rule applies to DMMs as it does to analog meters: use the lowest range that you can without going overrange for the best accuracy.
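Putting both parts of the DMM spec together, a short sketch (function and names made up for the example) shows the effect of range selection:

# Combined ±(% of reading + digits) error for a 3-1/2 digit meter,
# which displays up to 2000 counts on any range.
def dmm_error(reading_v, range_fs_v, pct_rdg=0.1, n_digits=1, counts=2000):
    count_v = range_fs_v / counts                 # value of one digit
    err_v = reading_v * pct_rdg / 100.0 + n_digits * count_v
    return err_v, 100.0 * err_v / reading_v

print(dmm_error(10.0, 200.0))  # ±0.11v, ±1.1% on the 200v range
print(dmm_error(10.0, 20.0))   # ±0.02v, ±0.2% on the 20v range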

The ±digit portion of the spec will usually be minimal for the higher dc voltage ranges and begin to increase as you go to the lower ranges. The ac ranges, even the upper ones, will probably have a higher figure here to begin with and really get nasty as you hit the lower ranges.

A DMM is basically a dc voltmeter with a basic 200mv or 2v range, plus a lot of conversion circuitry to get the other ranges and functions. This is why the dcv spec is the most accurate. dca is probably the second-most accurate function, as better meters have laser-trimmed current shunts. The acv function has the added fudge factor of the ac-dc converter, which often skews a basic ±0.1% dcv spec to ±2% or more for acv readings. A lousy meter right off Radio Shack's shelf may have a basic dcv spec of ±0.5% and an acv spec of ±4%, getting worse as the frequency of the ac voltage increases. The ohmmeter conversion circuits are often even worse than the acv converter. Watch your specs closely as you move from function to function. Just because a DMM is "deadly accurate" when measuring dc volts doesn't mean that it can accurately measure a 1% resistor for being within tolerance!

DMMs aren't always more accurate than analog meters. For instance, an older Simpson 260 or Triplett 630 has a 5000v dc range. If you're measuring high dc voltage in a high-impedance circuit, the DMM with its typical 20M ohm input resistance could very likely load the circuit down more than the analog meter, which has an input resistance of 100M ohms on the 5KV range! With a 20M ohm source impedance, for example, the DMM would indicate an actual voltage of 500 volts as only 250v, while the analog meter would show it much more accurately as about 417v.
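That loading effect is just a voltage divider formed by the circuit's source impedance and the meter's input resistance. Here's the arithmetic behind those numbers, assuming that 20M ohm source (a Python sketch, not anything from a datasheet):

# The meter's input resistance forms a divider with the circuit's
# source impedance, so the meter sees less than the true voltage.
def loaded_reading(true_v, source_megohm, meter_input_megohm):
    return true_v * meter_input_megohm / (meter_input_megohm + source_megohm)

print(loaded_reading(500.0, 20.0, 20.0))   # DMM, 20M input:  250.0v
print(loaded_reading(500.0, 20.0, 100.0))  # VOM, 100M input: ~416.7v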

A lot of stuff goes into determining the accuracy of your meter reading. The basic specification of ±% of reading/±digits always comes into play. For ac and dc voltages, the input resistance can affect high-impedance circuits. For ac and dc current ranges, the shunt resistance can affect the actual current in the circuit. On ac voltage and current ranges, frequency is a major factor in accuracy, and unless the meter is a true-rms meter, the waveshape will affect accuracy as well.

Meters will often have a display that resolves more digits than the meter's specification can handle, and this is a personality flaw that extends across all brands, including Fluke and Agilent (formerly Hewlett-Packard) meters. In actuality, a 4-1/2 digit meter is resolving to 0.005% of its range, so if the basic accuracy of the meter is ±0.05%, that last digit is in question. It's OK to use that least-significant digit for relative trends and changes in a measurement, but you really can't use it for a quantitative reading, i.e., you can't treat it as deadly accurate, because the basic meter specs won't allow that. Most DMMs, even the best, are this way, and even their most-accurate ranges/functions will have a display that can resolve a little better than the actual a/d converter that drives it, and that's OK. But watch out if you shift over to the acv function, where a specification might slide to 1% or greater. That means that you've got to throw out the last two digits of the display for anything but casual observation. If manufacturers were faithful to their specifications, they would blank out falsely-resolving digits when they shift to the ac or resistance functions of the meter. Of course, that would be lousy marketing, so they don't do that.
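You can put numbers on that resolution-versus-accuracy mismatch in a couple of lines (illustrative values only):

# A 4-1/2 digit display resolves 1 part in 19999 of its range,
# but the basic spec may only guarantee ±0.05%.
display_counts = 19999
resolution_pct = 100.0 / display_counts   # ~0.005% of range per count
spec_pct = 0.05                           # basic dcv accuracy

print(f"display resolves {resolution_pct:.4f}% of range")
print(f"spec guarantees only {spec_pct}%, about {spec_pct / resolution_pct:.0f}x coarser")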



Now, for your particular situation of a 4-1/2 digit meter with a ±0.05%, ±3 digit specification on the 200mv range. It's probably the low 200mv range that gives you that relatively large ±3 digit figure, most likely because of a noisy ac-dc converter (assuming that you're measuring ac, since it's a true-RMS meter). A reading of 100mv on this range (a display of "100.00") will have a basic accuracy of ±0.05% of 100mv, or ±0.05mv (50µv). If the 100mv were deadly accurate, your meter could read as low as 99.95mv and as high as 100.05mv. But then you have to add on that other specification, ±3 counts of the last digit (3 × 0.01mv = 0.03mv), widening the final expected reading to anywhere from 99.92mv to 100.08mv. So, to answer your questions directly: "rdg" is the indicated value, and "+3 dgt" means 3 counts of the least significant digit of the display.
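Here's your 200mv-range case worked out in Python so you can plug in other readings (the function and defaults are mine, sketched from the spec you quoted):

# ±(0.05% rdg + 3 dgt) on the 200mv range of a 4-1/2 digit meter.
def bounds_200mv(reading_mv, pct_rdg=0.05, n_digits=3, count_mv=0.01):
    # count_mv: one LSD on the 200mv range (display reads down to 0.01mv)
    err_mv = reading_mv * pct_rdg / 100.0 + n_digits * count_mv
    return reading_mv - err_mv, reading_mv + err_mv

lo, hi = bounds_200mv(100.00)
print(f"{lo:.2f} to {hi:.2f} mv")   # 99.92 to 100.08 mv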

Hope I didn't bore you too much!

Dean
 