Worth calibrating a multimeter?

Status
Not open for further replies.

Revolvr

Member
Is it worth ever re-calibrating a multimeter used at home for hobby?

I'm familiar with commercial standards like ISO 9001 and TL 9000, which would have even a lab multimeter calibrated yearly or so.

I have a Fluke-77 I bought in 1986. Works like new and I assume it's right - but I've never checked it since I bought it.

TIA

-- Dan
 
Depends on what you are testing and working on. For most home and hobby stuff, close enough is good enough. Basically, if you measure a resistor and it reads within its tolerance, I'd say it's good enough.
 
I bought a Radio Shack analog meter at a yard sale (a VTVM design using FETs); it must be over 20 years old, but it still agrees closely with my new Fluke 73III. If you are doing work where it has to be right, calibration is a must, but for hobby work you will know if the meter fails because it will be way off.
 
OK - calibration ISN'T about your meter (or scope, or whatever) being accurate - it's SOLELY about having a paperwork trail to show it's been calibrated. For home use it would be a complete waste of time, unless your entire home is run under ISO9001? - which is simply a paper trail system, not a quality system of any kind. Products made under it might be completely useless, and regularly fail, but as long as the company have a nice paper trail in place it meets all the requirements.

Bear in mind, if you buy a brand new meter or scope, you still have to send it away to be 'calibrated', or you fail your ISO9001.

I wouldn't consider sending a meter away for calibration, either at home or at work!
 
hi revolvr,
Equipment calibration checks can be expensive, especially if carried out by certified test houses.

The cal certificate is mainly for the commercial environment, where it's 'supposed' to ensure standardisation and traceability.

For example:
if the documentation for setting up a piece of equipment you have manufactured says, set this "Test Point to +3.99v", it assumes that a calibrated DVM was used to determine this voltage during manufacture.

As already suggested by Nigel, sadly a number of companies use the ISO9001 rating as a PR exercise and as an 'insurance' policy for whom to blame when things go wrong.

You can create your own 'sub-standard' test sources for your hobby test equipment if you feel it's required.
eg: a high-quality 6.2V zener, with a 1% metal-film (MF) resistor etc.
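
As a quick back-of-envelope sketch of the zener idea: sizing the series resistor is just Ohm's law across the supply-to-zener drop. The 9V battery supply and roughly 5mA bias current are my own assumed example numbers, not values from this thread.

```python
# Sketch: sizing the series resistor for a simple zener "sub-standard"
# voltage reference. Assumed values (illustration only): a 9 V battery
# feeding a 6.2 V zener, biased at about 5 mA for a stable operating point.

def zener_series_resistor(v_supply, v_zener, i_zener):
    """Series resistance (ohms) that sets the zener bias current."""
    return (v_supply - v_zener) / i_zener

r = zener_series_resistor(9.0, 6.2, 0.005)
print(round(r))  # 560 -- conveniently a standard E12 value
```

Anything in the few-mA region works; the point is just to keep the zener at a consistent operating current so its voltage repeats from check to check.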

Regards
Eric
 
In some situations the lack of a formal system of calibration can be very expensive. In a large organization - or between companies that put together sub-assemblies - subtle differences can be significant. The paper trail adds little value - I'll agree with that. Ignoring the process entirely could be a huge mistake.

I would agree that for home use, and for many commercial applications, formal calibration adds little value. What you really should do, though, is periodically check your equipment against known points of reference. I have some friends who work with high-quality equipment - they will calibrate their personal equipment - and I'll check mine against theirs on occasion. Quite often the result is "close enough". If there were significant error, you might then choose to make some adjustments - or investigate whether the error is more than just calibration drift.

As already suggested, creating your own, inexpensive bench standards might not be all that difficult. You might purchase some precision resistors, zeners, and so forth. Batteries in good condition are relatively reliable voltage sources. You might occasionally compare your equipment against your standards - particularly when you are getting readings that leave you wondering. This isn't really calibration but it's actually quite sufficient for many applications.
 
stevez said:
In some situations the lack of a formal system of calibration can be very expensive. In a large organization - or between companies that put together sub-assemblies - subtle differences can be significant. The paper trail adds little value - I'll agree with that. Ignoring the process entirely could be a huge mistake.

Only a mistake if you're supposed to be meeting ISO9001 - and even in that case, the certification is only valid for the moment they tested it; by the time you get it back it might be worse than before you sent it - you've no way of knowing! For many meters in a company it would be a good idea to make regular comparisons between them, so you would catch a potentially faulty one; this would be far more useful than any yearly 'calibration'.

I would agree that for home use, and for many commercial applications, formal calibration adds little value. What you really should do, though, is periodically check your equipment against known points of reference. I have some friends who work with high-quality equipment - they will calibrate their personal equipment - and I'll check mine against theirs on occasion. Quite often the result is "close enough". If there were significant error, you might then choose to make some adjustments - or investigate whether the error is more than just calibration drift.

In most meters, how would you make such adjustments? They don't normally have any adjustable components inside - at least not for individual ranges!
 
My experience with ISO is related to knowing what's important about your processes, establishing what is critical (and what isn't), and doing it. If you make measurements and have determined that they are critical, calibration might be what offers you some assurance or confidence that your measurements compare to a known point of reference. Maybe there's another word for it, but calibration was around long before ISO - and has a place regardless of ISO. If the accuracy of your instrumentation wasn't important yet it was covered under ISO, then it would seem that someone made things more complicated than necessary.

The calibration of a DMM or other instrument might be more of a go/no-go check if no adjustments are possible. Where the accept-or-reject point lies would depend on the application. If you are not willing to discard a DMM or whatever, then knowing the amount of error might allow some correction to be applied to measurements. I can imagine not being too worried about it at home, but in a commercial, industrial, or other enterprise, it would seem that knowing where your instruments stand compared to accepted standards is often very important. Not sure what that is, if not calibration.
 
Nigel Goodwin said:
OK - calibration ISN'T about your meter (or scope, or whatever) being accurate - it's SOLELY about having a paperwork trail to show it's been calibrated.

Whaaa? Uhh, no, that's not right. Calibration is solely about your instrument being correct against a traceable standard. Nobody calibrates just for the sake of showing it's been calibrated.
 
Nigel Goodwin said:
In most meters, how would you make such adjustments? They don't normally have any adjustable components inside - at least not for individual ranges!

Calibration constants are applied in the processor to generate the final "reading". Adjusting components is a 1970s-era way of doing things; these days everything has a micro in it.

The calibration constants are developed at the time of manufacture and are stored on board. Pretty easy.
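
A minimal sketch of what stored calibration constants might look like in practice. The linear gain/offset model and the numbers below are my own illustration, not any real meter's firmware.

```python
# Sketch: a DMM micro applying stored gain/offset calibration constants
# to a raw ADC count to produce the displayed reading. The constants
# are invented for illustration only.

def apply_cal(raw_count, gain_v_per_count, offset_v):
    """Turn a raw ADC count into volts using stored cal constants."""
    return raw_count * gain_v_per_count + offset_v

# Say factory cal found 1.0002 mV/count and a -3 mV offset on this range:
reading = apply_cal(5000, 0.0010002, -0.003)
print(f"{reading:.4f} V")  # 4.9980 V
```

The nice property of doing it in firmware is that "calibration" becomes rewriting two numbers in non-volatile memory rather than turning trimmers.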
 
HarveyH42 said:
Basically, if you measure a resistor, and it reads within it tolerance, I'd say its good enough.
I agree, but select a new 1% resistor and measure a range of them; 100Ω is a good value, since you should expect it to read between 99Ω and 101Ω.
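
One wrinkle worth noting: the 99-101Ω band is the resistor's tolerance alone; a fair go/no-go band also has to allow for the meter's own spec. A sketch, using a made-up ±(0.5% + 2 digits) meter spec at 0.1Ω resolution rather than any particular datasheet:

```python
# Sketch: widen the resistor-check acceptance band to allow for the
# meter's own accuracy spec. The +/-(0.5% of reading + 2 digits) spec
# and 0.1 ohm resolution are assumed example numbers.

def acceptance_band(nominal, resistor_tol, meter_pct, digits, resolution):
    """Worst-case low/high readings still consistent with a good
    resistor measured by an in-spec meter."""
    worst = nominal * (resistor_tol + meter_pct) + digits * resolution
    return nominal - worst, nominal + worst

lo, hi = acceptance_band(100.0, 0.01, 0.005, 2, 0.1)
print(round(lo, 1), round(hi, 1))  # 98.3 101.7
```

A reading inside the wider band proves nothing is broken; a reading outside it means either the resistor or the meter is out, and you'd want a second resistor to tell which.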
 
Nigel Goodwin said:
OK - calibration ISN'T about your meter (or scope, or whatever) being accurate - it's SOLELY about having a paperwork trail to show it's been calibrated. For home use it would be a complete waste of time, unless your entire home is run under ISO9001? - which is simply a paper trail system, not a quality system of any kind. Products made under it might be completely useless, and regularly fail, but as long as the company have a nice paper trail in place it meets all the requirements.

Bear in mind, if you buy a brand new meter or scope, you still have to send it away to be 'calibrated', or you fail your ISO9001.

I wouldn't consider sending a meter away for calibration, either at home or at work!
Hello, it seems you know about meters - can you help me? I'm trying to find a comparison of the analog/digital displays on meters: LCD, LED, bargraph, lamps and buzzers.

Thank you, Quest
 
 
Quest said:
Hello seems you know about meters, can you help me, trying to find the comparison of analog/digital displays on meters, in their LCD, LED, baragraph, lamps buzzers.

An analog meter is nice because it "filters" the voltage and you can read the average of a varying voltage; an autoranging DVM is useless when the voltage is not constant. A digital meter is good because it is much more accurate than an analog one. An LCD display is good, but hard to read in dim light without a backlight; LCD is low power, so it is used in portable, battery-operated tools. An LED display uses more power but is easy to read. A bargraph has limited resolution and is difficult to interpret, but is good for level indication.
 
OK, I have to weigh in on this one as I have the background in U.S. Navy PMEL (repair and calibration of test equipment), as a Tektronix bench technician doing the same and having been contracted to develop a calibration lab's system for satisfying ISO9002 standards. I say ISO9002 vs. ISO9001 because as I recall (it's been at least 15 years since playing with ISO standards), ISO9001 is for manufacturing while ISO9002 covers service. As I recall, anyway. I am getting aged, you know!

1) Calibration is the adjustment or check of the instrument to ensure that it is either within specifications or is adjusted to be so.

2) Certification is the documentation and traceability to national standards that accompanies a calibration, and optionally may have before-and-after test data involved.

3) Instruments that have no adjustments or few adjustments (DMMs, VOMs and VTVMs are typical) are checked to ensure they are within specifications on all ranges after the few basic (and only) adjustments are made. Out-of-spec ranges involve either repair or discarding of the instrument, depending upon the cost.

4) ISO9002 requirements are NOT just for PR. Unfortunately, a small company may have high-quality standards that are routinely sent off to a cal lab, and good repair and calibration practices, but unless they either have ISO9002 certification or can set up their business so that it falls under the umbrella of a larger company's ISO9000 certification, they cannot do business with that company or any other companies who tout ISO9000 compliance. In most cases in the U.S. it's no big deal, because ISO9000 isn't a life-or-death system here. But in Europe, it may be a different story.

5) A large company doesn't have to send its test equipment out to be calibrated/certified. If they have their own internal cal lab (a very expensive proposition, and few do), they only have to send out their standards and the high-end equipment that's used to calibrate the house equipment. If they don't have their own internal lab, they can contract out for on-site service from most major calibration sources, such as Tektronix, Agilent (the former Hewlett-Packard), Fluke, or many larger independent labs. Same service, same results, and often less expensive than sending out equipment. And the downtime is nil.

6) Digital meters aren't always as accurate or precise as analog meters. Although they aren't made anymore, the ORIGINAL "Fluke" meters were differential voltmeters: analog, and typically accurate to 0.01%. And when they were "at null", their input impedance was infinite! They were a little slower to use than a modern DMM, but they had some other advantages, too.

7) As to the benefit of formal calibration for hobbyists? Rubbish, as most of us have chimed in. Internal checks with decent standards (untraceable but reliable) are good enough. Most high-end DMMs will hold their specification for several years, and most will hold more than tight enough for a decade of hobbyist use.

8) Nuts to this zener-as-a-voltage-standard stuff! There are plenty of voltage reference chips out there from Analog Devices and Maxim that are accurate to parts per million (rather than a percentage) and are either inexpensive to purchase from a vendor or are available as free samples from the manufacturer. Use THOSE!

Dean
 
I don't know why they don't just get the expensive meters calibrated, use them to check that the cheaper meters are still in range, and throw them away if not.
 
Generally speaking, the cheaper meters would be used more for troubleshooting than for adjustments to equipment, and that would be a better idea: just use a comparison check to ensure the cheap meters are OK. However, you would need to check that such a procedure would pass ISO9000 guidelines and so forth.

A few years ago, there was a guy who must've saved up a pile of money and he bought a high-end Hewlett-Packard (then) DMM, maybe a 6-1/2 digit model. He promptly began offering "calibration checks" for your meter for a small fee. Hmmmmmm.

Dean
 
I don't know why they don't just get the expensive meters calibrated, use them to check that the cheaper meters are still in range, and throw them away if not.
That is exactly what we did: we had a Tektronix 475A/DM44 oscilloscope and a Fluke 8060A calibrated to a traceable standard, and used these instruments as standards to check the other meters and scopes.
In 99% of cases all meters passed their tolerance checks, and almost every scope passed too; we noted the failures. This satisfied ISO9001 standards and allowed us to trade with other companies that insisted on dealing with ISO9001-accredited companies. Proper record keeping is essential, as Nigel said, to satisfy the paper trail. Did it enhance our fault-finding techniques? NO.
However, when you are in dispute in a multi-vendor environment, it is paramount that you can stand by your findings and have faith in your equipment. For that reason alone we often hired specialised equipment with traceable calibration standards.

I have a CRO, a Fluke and an analogue meter; none are calibrated, but I have faith in their accuracy (within limits) as I frequently use them, and any great discrepancy would show up in cross-comparisons.

Cheers RH
 
No, but it is worth checking it!

Goodday,

I have about 35 years of using a meter, and I am only 43 now! Yep, I started at about 8 years of age, and spent a good part of my time in industry.

Is it worth calibrating a multimeter?

My response is no, definitely not worth calibrating a meter. And I leave myself open to some prof saying nonsense, but I am used to standing alone in knowing my truths.

By calibrating a meter, what you are saying is that you or someone else knows more about your meter than the people at the factory who made and QC'd it.

Rest assured, no matter how large the external organisation, the quality of its measuring devices won't be a patch on what the factory has at base!

We had this debate when I first started out in work. People wanted to frig their Citizen Band radios to overdrive the power transistors so as to get more effective power out of their rigs. They did all sorts of nonsense, tweaking all sorts of ferrite cores, until eventually the transistor packed in and they came to the shop for repairs; often they never even bothered using heat sinks. Daft fools!

Declining to calibrate a meter does not mean that it should not be validated as correct. When one refers to the term calibration, one tends to imply correction. On the old favourite meters (AVO - RIP, thank god!), calibration used to involve a series of measurements and charting to establish the errors of performance. A deviations sheet, if you like: just like a log table, you add or subtract the error from the sheet.

Now how many people do you know in engineering who, on every measurement they make, dig out an old filthy scrap of paper, well used, to find the error-correction factor to apply to their statement? Not many, I guess - not even the big boys with ISO practice standards, I assure you.
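
The old AVO-style deviation sheet is easy to picture in code: a per-range correction you add to the raw reading. The ranges and correction values below are invented for illustration, not from any real cal sheet.

```python
# Sketch: a "deviation sheet" as a lookup table of per-range corrections
# (in the range's unit) added to the raw reading. Values are invented.

DEVIATION_SHEET = {
    "10V":  -0.05,   # this meter reads 0.05 V high on the 10 V range
    "100V": +0.30,   # and 0.30 V low on the 100 V range
}

def corrected_reading(raw, meter_range):
    """Apply the charted error correction for the given range."""
    return raw + DEVIATION_SHEET[meter_range]

print(round(corrected_reading(9.20, "10V"), 2))  # 9.15
```

Which makes the poster's point nicely: almost nobody actually performs this extra lookup step on every reading, on paper or otherwise.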

Someone's gonna mail me back direct now and say, Hey dude, I do! You will be the first fellow I have met that does!

When one states calibration, what we understand that to mean is that the instrument is corrected in some way to comply with a standard. That can be manual error correction using a frig factor via a sheet, or it can be automated by making adjustments to the performance of the item. In any case, it implies that some kind of alteration is being made to the factory norm, for the purposes of shared norms and conventions of standards outside and independent of the manufacturer.

If your meter is adjusted by a third party, your meter is no longer covered by warranty. And you won't know where the devil you are - what's been changed and what ain't!

When you get your meter, you should check that it agrees with known quantities, of course - at least so you can become familiar with its functional use in application. In other words, satisfy yourself about how the meter functions, and its limits.

What calibration does is ensure that a factory full of meters from one supplier don't differ in performance or reading. If your meter makers are worth their claims - and most Eastern countries consider their work an honour of service - your meters, however many you have, are not going to deviate from the factory settings unless acted on by another source.

When someone comes to you from another source with another make of meter, your references and readings should be roughly the same, and differences between the makers will eventually reveal themselves naturally. If there is any point to be taken about the functioning of your meter in comparison to elsewhere, only then will it be a matter for the makers of your meter to deal with and correct. If you get the meter calibrated, all you do is transplant the readings from one source reference, out of context, to another source.

Your personal checks will ensure conformity to specifications.

At present I am exploring temperatures. Thus far I find that the clinical thermometers, with a sensitivity of 0.1 degree C, are in point of fact 2 degrees above the norms of the conventional standard! It seems to me they use some kind of NOBO Daddy reference, complete with the whole pie chart as well! NOBO Daddy says all the other thermometers are wrong - look how accurate this clinical device is; sure your items ain't above par. I pull out a 100-degree brewery thermometer, which to all intents is clinical grade, and say, hey, that matches my readings on all other meters - and they all measure by different thermo means, some alcohol, some infrared, some thermocouple. How come the clinical standard is two degrees high, and the K probes match the clinical standard? Maybe K probes get calibrated to medical thermometers? Not sure how you do that at 400 degrees C; clinical thermometers only read up to 46, usually. I am spot on to one degree with a general-purpose meter, and I say some NOBO DADDY'S BEEN MESSING ABOUT, making their K probes match the readings of clinical thermometers: having a play with things better left to the factory standard!

Definitely not! Don't get your meter calibrated - DO check your meter readings against known quantities.

For thermometry I can't do better than a traditionally made thermometer, alcohol or mercury; they calibrate them against a known good standard.

My probes match that! I know its field of use, and know it's better than the 2 percent error of the standards of international conventions.

If I use a clinical thermometer, it will also agree within its field of usage - albeit 2 degrees above truth, but accurate to 0.1 sensitivity within that field.

Far be it from me to redefine the natural human body temperature to 35 degrees, but I must ask the question: on whose body did they calibrate the reference point?

I knew they called me a cadaver at birth for some reason!

David Christmass
 
