How much input voltage protection for ohm meters?

Status
Not open for further replies.

Grossel

Well-Known Member
Hi forum.

In a local forum (not particularly focused on technology) I had a discussion with a guy who needed help identifying which wire was which in a car, and I told him how to use an ohm meter for this.

I also warned against measuring (in ohm mode) parts that may be under voltage. This may be naive, because I haven't actually read anything about the internal circuitry of a standard multimeter; I'm just assuming it provides a constant current source and a simple voltage meter, and that an external voltage applied to the test leads may harm the multimeter.
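My guess about the internals can be sketched numerically. This is a minimal Python model built purely on that assumption (constant current source plus voltage meter); the test current and resistor values are invented for illustration, not taken from any real meter.

```python
# Idealized model of a DMM in ohms mode, assuming (as guessed above)
# that the meter forces a fixed test current through the part and
# divides the measured lead voltage by that current.
# All values here are hypothetical.

I_TEST = 0.001  # assumed test current: 1 mA

def measured_ohms(r_dut, v_external=0.0):
    """Resistance the idealized meter would display.

    The meter assumes the only voltage across its leads comes from
    its own test current. Any external voltage (a live circuit, a
    battery) is folded into the reading, so the displayed value is
    wrong -- and the external source can also push current through
    the meter's input circuitry that the ohms range never expects.
    """
    v_leads = I_TEST * r_dut + v_external
    return v_leads / I_TEST

# Dead circuit: the reading matches the actual resistance (~100 ohms).
print(measured_ohms(100))

# Same 100-ohm part with 12 V across it: the reading is nonsense
# (~12 kilo-ohms in this model), quite apart from any damage question.
print(measured_ohms(100, 12.0))
```

This only shows why the *reading* is meaningless on a live circuit; whether the meter survives depends entirely on its input protection, which this toy model says nothing about.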

Then, of course, another forum user was totally sure that it is perfectly safe for any multimeter to put the test leads (in ohm mode) across a 12V battery, and that the multimeter cannot be damaged by this.


So my question: are all multimeters built so that, in ohm mode, they can withstand an input voltage?

Or to be more precise: is there a standard that multimeter manufacturers tend to follow, or any documentation that can support such a claim?
 
I would suggest there's little or no protection, it's designed for people who understand what they are doing - and I certainly wouldn't consider using ohms ranges on any powered circuit, never mind a car battery.

It 'may' be OK, but it 'may' be OK if you hit it with a hammer, and I wouldn't do that either.
 
Well, the core issue is that the other forum user is very confident in his claim, and I cannot point to any source to prove him wrong . . .
 
Ok, so basically a Fluke should be well protected, and for any other meter one must refer to the manual.
 
For any meter, refer to the manual. Assumptions can be expensive ($5 DMMs aside). I would not measure the resistance of a battery with either of my Fluke meters.
 