Real world test measurements?


SomeoneKnows

New Member
I’m trying to learn about electronics and have bought a number of books on the subject. I seem to learn better when I can apply things to real world applications. I wanted to start out with something seemingly simple. I want to tap into my Jeep's temperature sending unit where the dash gauge hooks into.

I know the temperature sending unit is a variable resistor. I am trying to get some accurate measurements of the ohm values. I found that I get different readings from my ohmmeter if I use the grounding strap on the firewall versus the negative post on the battery. I understand now why this would happen: there is additional resistance caused by the longer path through the vehicle's body, grounding strap connections, and wiring back to the battery.

I am also getting different readings from my ohmmeter when the engine is running versus off. With the temperature gauge in the dash showing somewhere around 215 degrees, I am getting a reading of 65 ohms with the engine running and 158 ohms with the engine off. The factory service manual says I should be reading 93.5 ohms at 220°F. I don’t understand how to get an accurate reading regardless of whether the engine is running or not.

I would like to hook up both the ohmmeter and the factory gauge at the same time to analyze the data under normal operating conditions. I know the gauge works by comparing voltages between a constant source and the voltage through the sending unit. Maybe I need to rethink my approach. When I try getting a reading with my ohmmeter from the sending unit and hook up the dash gauge wire to the sending unit, the ohmmeter goes to an open reading. I assume this is happening because the circuit is being completed through the instrument cluster gauge. Is there a way to continue sampling data from the thermistor while the gauge is connected too? Is it possible to test the resistance within a circuit while the circuit is powered? Would putting diodes in the circuit let me take readings with both or is there a better way? I want to learn how to hook into the factory sensors without adding new sensors to do the same thing.

The books I have so far don’t seem to go into real world problems like this and it is hard to ask a book a question. I have the book “How to test almost everything electronic” but I didn’t find clues to these questions. That book seems to be geared more toward televisions than automotive applications. If anyone has a book suggestion I am interested in that too.
 
Perhaps the solution to all your confusion is in the vast number of variables you introduce when anything (other than the sending unit) is connected into your circuit, including the internal resistance of your meter. You have varying voltage, the car's wiring, and the effects of a new circuit when you connect your meter. While your interest is great, you might consider a small, self-contained circuit (maybe on a breadboard) where you can control and take into consideration ALL the variables.
 
First off, as a general rule, it is not advisable to connect an ohmmeter to a circuit under power!

Now to attempt to address your question regarding the different readings...

Where the manufacturer specifies a resistance at a given temperature, he is referring to a specific set of test circumstances. Typically, tests like this are done by immersing the sender in a water bath with the ohmmeter leads attached and a thermometer in the water as well. Heat the water while monitoring the changing resistance through the sender and the water temperature, noting the resistance readings at the specified test temps given.

The most accurate way of evaluating the sender is, of course, with the sender out of circuit. Depending upon the internal circuitry of the dashboard unit, you will likely obtain misleading readings if you attempt to measure the sender's resistance while it is connected in circuit. Also, depending upon the type and amount of thread sealant used on the sender, you may find some strange resistances between the sender body and the cylinder head or engine block. Any resistance in the run from the dash unit's "sender" terminal to chassis ground will affect the calibration (read: accuracy) of the dash gauge.

There are two basic dash gauge circuit types. In the first type, grounding the "sender" lead yields a maximum gauge reading, while opening that lead yields a minimum gauge reading. The second circuit type operates in reverse of the first type, with a grounded "sender" lead producing a minimum gauge reading; opening that circuit will produce a maximum gauge reading. Normal operating range is somewhere in between these two extremes. Considering the first gauge type, the resistance through a serviceable sender will be "high" when the sender is cold, and the resistance will decrease as the sender heats up.

When you measure the resistance through the sender with the sender connected to the gauge, you are actually measuring the overall (effective) resistance between that point in the circuit and ground. In reality, this overall resistance is made up of two separate resistances in parallel -- the resistance through the gauge's internal circuit to ground and the resistance through the sender to ground. Of course, as already stated, this type of measurement should be made only on a non-powered circuit. The reading obtained here will be lower than you might expect, as the resistances are effectively in parallel to each other when measured in this manner. The "product over sum" rule for parallel resistances assures us that the effective resistance will be less than the lower of the two values in the parallel network.

For example, supposing a gauge resistance of 100 ohms and a sender resistance of 50 ohms (purely arbitrary values selected for this example), Rp = (R1 x R2)/(R1 + R2) = (100*50)/(100+50) = 5000/150 = 33.3333 ohms. The gauge resistance considered here is the resistance through the gauge from its sender terminal to its ground terminal.
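To make the product-over-sum arithmetic easy to play with, here is a minimal Python sketch using the same purely arbitrary 100 ohm / 50 ohm values from the example above (the function name is just for illustration):

Code:
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel (product over sum)."""
    return (r1 * r2) / (r1 + r2)

gauge_ohms = 100.0   # arbitrary gauge resistance from the example
sender_ohms = 50.0   # arbitrary sender resistance from the example

rp = parallel(gauge_ohms, sender_ohms)
print(f"In-circuit ohmmeter reading: {rp:.1f} ohms")  # 33.3 ohms -- lower than either branch alone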

Does this help any? :)
 
Gene said:
Perhaps the solution to all your confusion is in the vast number of variables you introduce when anything (other than the sending unit) is connected into your circuit

Ok, I think I see what you mean. I understand that the purest measurement would be obtained with the sending unit removed from the vehicle. I didn’t think about just checking the resistance between the engine block and the battery ground. I wasn’t expecting that to amount to anything (another lesson learned).

I suppose my readings were different because additional (stronger) circuits to the battery ground are “opened up” when the engine is running. I guess that could account for the lower resistance I saw when the engine was running.
 
ChrisP said:
First off, as a general rule, it is not advisable to connect an ohmmeter to a circuit under power!

I looked back through my books and found Ohm's law and resistor color codes described most often, but didn’t find that little detail. I’m sure that is second nature to everyone here; if there’s a wrong way to hook something up, I seem to find it. Ok, so the approach I was using to try and read useful data was wrong. I see your point about the variation of resistance introduced through the gauge too.

I read in the Jeep service manual that the dash gauge works by comparing the constant supply voltage with the voltage across the sending unit. I guess I should be trying to read the voltage in the sending unit circuit instead of the resistance. A function of Ohm's law anyway, right?

I’ll try checking the voltage at the sending unit tomorrow with the circuit energized to see how it changes as the temperature changes. By sampling the voltage with the voltmeter, does this affect the voltage the temperature gauge reads causing a deviation in the temperature gauge too? I ultimately want to find a way of using the same sending unit for the analog gauge while simultaneously collecting the data digitally.

And yes, your post did help me understand what was happening. Thanks.
 
I'd guess that the ohmmeter current was too high and that is throwing things off. I was doing the same thing - getting confusing results - and convinced myself that the sensor was bad. A new sensor behaved like the old one. Fortunately I had a factory manual, and before spending even more money I decided to read it - and learned that I needed to use the diode function on my DVM. I got readings that made sense - there was some error, likely explained by previous posts - but I was in the ballpark and it behaved as expected when dipped in hot water, cold water, etc.
 
SomeoneKnows said:
By sampling the voltage with the voltmeter, does this affect the voltage the temperature gauge reads causing a deviation in the temperature gauge too?

The answer to this question lies in the design specifications of the voltmeter in use -- most specifically in the (DC voltage) sensitivity of the voltmeter.

Whenever a voltmeter is connected across any portion of an active circuit, the circuit "sees" the voltmeter as an additional "external" resistance that has been added in parallel. This is of course exactly what is actually happening. The voltmeter, when connected for example between the sender's gauge terminal and ground, effectively places an additional parallel resistance into the circuit.

Your Jeep manual explains that the gauge circuit works by comparing voltages dropped across the gauge and the sender, right? OK -- that much is correct, but let's examine it just a little bit more closely...

Assume that the gauge has some fixed internal resistance through its meter coil(s), and that the resistance through the sender changes with temperature. At any given temperature, the sender will then also have a "fixed" internal resistance. The ratio between the internal gauge resistance and the sender internal resistance at any given temperature of the sender will be the same as the ratio between the voltages dropped across those resistances.

Let's look at some numbers. The values given here are purely arbitrary and were in fact chosen for convenience in this example; real-world values will likely be very different. Assume a nominal +12VDC as our source voltage, which is already a departure from the real world, in that the auto electrical system actually operates at somewhat above 13 volts. Further assume a gauge internal resistance of 20 ohms, and at some given temperature, a sender internal resistance of 100 ohms. The effective series resistance through this gauge circuit is then 120 ohms, which yields a current of 0.1A with a 12V source. The voltage dropped by the gauge would then be 0.1A x 20 ohms = 2.0 volts. The voltage dropped by the sender would similarly then be 0.1A x 100 ohms = 10.0 volts.

As the sender changes resistance with temperature, the effective series resistance changes, with a resulting change in current. Suppose that the sender resistance changes to a low value of 20 ohms. Now the series resistance is 40 ohms (the sender's 20 plus the gauge's 20) and the circuit current is then 0.3A (12V / 40 ohms). Individual voltage drops across the gauge and sender are the same at 6V (20 ohms x 0.3A). Now suppose instead that the sender's internal resistance were increased to 220 ohms. Series resistance would then be 240 ohms, and the current would then be 0.05A. Voltage drop across the gauge would now be 1V (0.05A x 20 ohms), while the sender would drop 11V (0.05A x 220 ohms).
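If it helps to see all of that arithmetic in one place, here is a small Python sketch using the same arbitrary 12V supply and 20 ohm gauge, recomputing the current and voltage drops for the three sender resistances above (purely illustrative values, not real Jeep figures):

Code:
SUPPLY_V = 12.0      # nominal source from the example (a real charging system sits above 13 V)
GAUGE_OHMS = 20.0    # arbitrary gauge coil resistance from the example

def divider(sender_ohms):
    """Series current and the voltage drops across gauge and sender."""
    current = SUPPLY_V / (GAUGE_OHMS + sender_ohms)
    return current, current * GAUGE_OHMS, current * sender_ohms

for r_sender in (100.0, 20.0, 220.0):
    i, v_gauge, v_sender = divider(r_sender)
    print(f"sender={r_sender:5.0f} ohms  I={i:.3f} A  "
          f"V_gauge={v_gauge:.2f} V  V_sender={v_sender:.2f} V")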

Got all of that? OK -- it is the changing current that produces gauge (meter) deflection by changing the effective strength of the magnetic field produced by the meter coil.

Now to the point that started all of this -- the effect of placing a voltmeter in parallel with the sender. You should be aware that not all voltmeters are equal as regards accuracy. Simply put, the higher the sensitivity of the voltmeter, the less it loads the circuit under test and the more accurate its in-circuit readings will be. Voltmeter sensitivity is normally stated in "ohms per volt" form, and this information can usually be found either on the meter face or in the meter documentation.

Remember the example above of the sender with 100 ohms internal resistance? We showed in our example that the sender would drop 10V in our sample circuit. This means that the voltage at the sender's gauge terminal, when measured with respect to ground, should nominally be 10 volts. What would happen if we used a really cheap voltmeter with, say, 500 ohms per volt sensitivity? The voltmeter, when placed across a portion of a circuit, "loads" or draws current from that circuit. The amount of current drawn is dependent upon the meter's sensitivity. The higher the meter sensitivity is, the less current will be drawn by that meter. For example, a meter with 1000 ohms per volt sensitivity will load the circuit roughly 20 times as much as a meter with 20,000 ohms per volt sensitivity would.

Consider the effect of using a (really cheap -- seven bucks at the auto parts store) 500 ohms per volt meter, set to its 15VDC range. This meter, when placed across the sender, would be roughly equivalent to placing a 7.5 K-ohm resistor in parallel with the sender. The circuit current would change, as would the voltage drop across the gauge, right? Granted, it would not be by very much -- the effective parallel resistance of the voltmeter/sender combination would be 98.68 ohms, giving us a net circuit resistance of 118.68 ohms, resulting in a circuit current of 0.1011A, thus dropping 2.022V across the gauge and effectively leaving 9.978V at the sender's gauge terminal. This gives a calculated error of 0.022V or 0.22%.

Crunch the same numbers for a typical 20,000 ohms per volt product like the $25 Radio Shack 22-221. This meter places an effective resistance of 500K-ohms in parallel with the sender when set to its 25VDC range, giving a total Rp for the meter/sender combination of 99.98 ohms, a total circuit resistance of 119.98 ohms, and a circuit current of 0.100017A. This equates to a voltage drop across the gauge of 2.00033V, leaving a calculated 9.99967V at the sender, an error of 0.00033V or 0.0033%.
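If you want to check these loading figures yourself, here is a rough Python sketch of the calculation, using the same arbitrary circuit values as above; the meter's effective resistance is simply its sensitivity multiplied by the selected voltage range:

Code:
SUPPLY_V = 12.0
GAUGE_OHMS = 20.0
SENDER_OHMS = 100.0   # sender resistance at the example temperature

def loaded_reading(ohms_per_volt, range_v):
    """Voltage at the sender terminal once the meter is connected across the sender."""
    r_meter = ohms_per_volt * range_v                          # meter resistance on this range
    r_p = (SENDER_OHMS * r_meter) / (SENDER_OHMS + r_meter)    # sender in parallel with meter
    current = SUPPLY_V / (GAUGE_OHMS + r_p)
    return SUPPLY_V - current * GAUGE_OHMS

unloaded = SUPPLY_V * SENDER_OHMS / (GAUGE_OHMS + SENDER_OHMS)  # 10 V with no meter attached
for opv, rng in ((500, 15), (20_000, 25)):
    v = loaded_reading(opv, rng)
    err = unloaded - v
    print(f"{opv:>6} ohms/volt, {rng} V range: reads {v:.5f} V "
          f"(error {err:.5f} V, {100 * err / unloaded:.4f} %)")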

BTW -- The additional current (over our original 0.1A from the example above) in each of these two scenarios is roughly the current drawn by the voltmeter -- about 0.0011A with the 500 ohms per volt version, and about 0.000017A with the 20 K-ohm per volt model.

While these errors are negligible at the voltage and current levels present in this circuit, I am sure that you can see how they can become important in other circuits.
 
ChrisP said:
The answer to this question lies in the design specifications of the voltmeter in use -- most specifically in the (DC voltage) sensitivity of the voltmeter.

I didn’t find the ohms per volt rating on my meter. I bought it at an auction a few years ago and the documentation didn’t come with it. I have a Wavetek DM5XL and did a Google search this morning and found a manual online.

First paragraph in the manual under resistance testing says, “Turn off power to the resistance to be measured and discharge any capacitors. Any voltage present during a resistance measurement will cause inaccurate readings”. Ok, it would have helped if I’d thought of searching for the manual before.

Under the specifications section, the DC voltage specification says “Input Impedance: 1M ohm”. For this tester, is the ohms per volt value you described the same as the input impedance, 1M ohms per volt?
 
SomeoneKnows said:
Under the specifications section, the DC voltage specification says “Input Impedance: 1M ohm”. For this tester, is the ohms per volt value you described the same as the input impedance, 1M ohms per volt?

For all intents and purposes as applied to DC circuits, yes. The term input impedance is most often applied to AC or mixed-signal readings. When dealing with DC circuits, the input impedance is basically the same as the maximum DC sensitivity. As applied to self-powered meters (VTVM's, TVOM's, etc.), the spec is given as an input impedance rather than a DC sensitivity. This type of meter has almost no loading effect on the circuit under test. In addition, the circuit loading of most VTVM's and TVOM's will remain constant regardless of the voltage range in use, which is why you don't see a "per volt" caveat on the spec. :)
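As a rough sanity check against that spec -- reusing the purely arbitrary 12V / 20 ohm gauge / 100 ohm sender figures from earlier in this thread, so the numbers are illustrative only -- a 1M ohm input impedance loads that example circuit even less than the 20,000 ohms per volt analog meter did:

Code:
SUPPLY_V = 12.0
GAUGE_OHMS = 20.0
SENDER_OHMS = 100.0
R_METER = 1_000_000.0   # DM5XL spec sheet: 1M ohm input impedance on DC volts

r_p = (SENDER_OHMS * R_METER) / (SENDER_OHMS + R_METER)   # sender in parallel with the meter
current = SUPPLY_V / (GAUGE_OHMS + r_p)
v_sender = SUPPLY_V - current * GAUGE_OHMS

unloaded = SUPPLY_V * SENDER_OHMS / (GAUGE_OHMS + SENDER_OHMS)  # 10 V with no meter attached
err = unloaded - v_sender
print(f"Reads {v_sender:.5f} V, error {err:.5f} V ({100 * err / unloaded:.5f} %)")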
 