
Understanding specifications of instruments

Status
Not open for further replies.

Parth86

Member
Hello
I need help understanding the specifications of measuring instruments. I understand what accuracy is, but I don't understand what precision is.
Example: a voltmeter with 3% accuracy.
If the voltmeter has 3% accuracy and the measured value is 10 V, that means the voltmeter might read anywhere between 9.7 and 10.3 volts.
What is the meaning of precision for a measuring instrument?
Can someone help with an example: a voltmeter with 2% precision?
 

Tony Stewart

Well-Known Member
Most Helpful Member
When random noise affects multi-digit readings such as frequency, the standard deviation of the result is reduced by averaging: it shrinks by the square root of the number N of repeated samples. So averaging 100 samples reduces the jitter to 1/10 of its original value.

I suppose this is an example.
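The square-root-of-N effect above is easy to demonstrate in a quick simulation. This sketch uses made-up numbers (a 1000 Hz "true" frequency with 1 Hz of Gaussian jitter, both hypothetical) to show that 100-sample averages scatter about ten times less than single readings:

```python
import random
import statistics

random.seed(42)

TRUE_FREQ = 1000.0  # hypothetical "true" frequency, Hz
NOISE_SD = 1.0      # hypothetical standard deviation of the jitter, Hz

def single_reading():
    """One noisy measurement."""
    return random.gauss(TRUE_FREQ, NOISE_SD)

def averaged_reading(n):
    """Average of n noisy measurements."""
    return sum(single_reading() for _ in range(n)) / n

# Spread of raw readings vs. spread of 100-sample averages:
raw = [single_reading() for _ in range(2000)]
avg = [averaged_reading(100) for _ in range(2000)]

print(statistics.stdev(raw))  # close to 1.0 Hz
print(statistics.stdev(avg))  # close to 0.1 Hz: reduced by sqrt(100) = 10
```

The averaged readings are more *precise* (they repeat more tightly), but note that averaging does nothing for a systematic offset, i.e. it does not improve *accuracy*.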
 

MrAl

Well-Known Member
Most Helpful Member

Hi,

Usually the accuracy is a simple statement of how close the instrument can get to showing the actual quantity being measured.

For example, say you have a calibrated voltage reference standard of 4.000 volts DC that is accurate to plus or minus 100 uV. That means it can actually be anywhere from 3.9999 V to 4.0001 V. You read this voltage with a DC voltmeter with 1 percent accuracy. 1 percent of 4 is 0.04 volts, so your reading will be between 3.96 V and 4.04 V. That is simply the actual voltage plus and minus the percentage of the voltage it can be off by:
4.000*0.01 = 0.040, and 0.040 subtracted from 4.000 is 3.960 and added to 4.000 is 4.040, so the range is 3.960 to 4.040 volts.
Digital meters usually also specify a 'count' that goes with that percentage. The 'count' may be "plus 2 counts" for example.
With the previous example and a meter that could read 4.000 volts with four digits like that, the range was 3.960 to 4.040, but because of this 'count' we have to add it to the upper end of the range, so we get a grand total of 3.960 to 4.042, because 2 counts (0.002 at this display resolution) must be added to 4.040 to get the final upper figure.
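The arithmetic above can be wrapped in a small helper. This is just a sketch of the worked example; the function name and the "counts widen only the upper limit" behavior simply follow the "plus 2 counts" example given here (real meter datasheets often quote plus-or-minus counts instead):

```python
def reading_bounds(reading_v, pct, counts=0, count_step=0.001):
    """Worst-case range for a meter specified as pct% of reading.

    The optional "plus N counts" term widens the upper limit, matching
    the example above. count_step is the value of one display count:
    0.001 V for a four-digit 4.000 V reading.
    """
    pct_err = reading_v * pct / 100.0
    return reading_v - pct_err, reading_v + pct_err + counts * count_step

lo, hi = reading_bounds(4.000, 1.0)            # percentage term only
print(round(lo, 3), round(hi, 3))              # 3.96 4.04
lo, hi = reading_bounds(4.000, 1.0, counts=2)  # plus 2 counts
print(round(lo, 3), round(hi, 3))              # 3.96 4.042
```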

Resolution is a different spec. That is the smallest quantity that the meter can 'resolve'.
For example, if you have a meter that reads this small set of voltages over time:
1.000, 1.001, 1.002, 1.003

then it looks like the resolution is 0.001 because that's the smallest difference it can read.
Readings like this:
1.0001, 1.0002, 1.0003, 1.0004

would mean that the meter has 100 microvolt resolution because it can tell the difference between one voltage and another voltage that is 100 microvolts above that one.
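In code, a meter's resolution acts like rounding to the nearest step it can display. The `displayed` helper below is hypothetical, just to illustrate the two resolutions in the examples above:

```python
def displayed(value, resolution):
    # What a meter with the given resolution shows: the input value
    # rounded to the nearest step it can resolve.
    return round(value / resolution) * resolution

v = 1.00037  # an arbitrary input voltage
print(f"{displayed(v, 0.001):.3f}")   # 1 mV resolution shows:   1.000
print(f"{displayed(v, 0.0001):.4f}")  # 100 uV resolution shows: 1.0004
```

The coarser meter cannot distinguish 1.00037 V from 1.000 V at all, which is exactly what "0.001 resolution" means.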
 

Parth86

Member
Thank you for your detailed explanation.
Now I understand accuracy and resolution. Can you tell me what precision means for a measuring
instrument? What is the meaning of "a voltmeter with 2% precision"?
 

OBW0549

Active Member
Here is a good explanation of accuracy, precision, resolution and sensitivity. They all relate to how "good" an instrument is, but express different things.
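Roughly speaking, precision describes repeatability (how tightly repeated readings cluster), while accuracy describes how close readings are to the true value. A short simulation with made-up numbers contrasts the two: meter A has no systematic offset but a lot of random spread, meter B reads consistently 0.3 V high but barely scatters at all.

```python
import random
import statistics

random.seed(1)
TRUE_V = 10.000  # hypothetical true voltage

# Meter A: accurate but not precise — no offset, noisy readings.
meter_a = [random.gauss(TRUE_V, 0.10) for _ in range(1000)]

# Meter B: precise but not accurate — fixed +0.3 V offset, little noise.
meter_b = [random.gauss(TRUE_V + 0.30, 0.01) for _ in range(1000)]

print(round(statistics.mean(meter_a) - TRUE_V, 2))  # offset near 0 (accurate)
print(round(statistics.stdev(meter_a), 2))          # spread ~0.10 (imprecise)
print(round(statistics.mean(meter_b) - TRUE_V, 2))  # offset ~0.30 (inaccurate)
print(round(statistics.stdev(meter_b), 2))          # spread ~0.01 (precise)
```

On this reading, a "voltmeter with 2% precision" would repeat its readings within about 2% of each other, even if all of those readings were systematically off from the true value.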
 
