test equipment "decades"... which do you find the most useful?

unclejed613

Maybe I should do this as a poll... but here's an explanation followed by the question.

Most test equipment has dials where amplitude, frequency, or time goes in steps. On test equipment based on Tektronix conventions, the steps take the form "1, 2, 5, 10", so for instance the voltage ranges of an oscilloscope would step 1 mV, 2 mV, 5 mV, 10 mV... per centimeter. On test equipment based on Hewlett-Packard conventions, the decades go in steps of "1, 3, 10". Then occasionally I see a "compromise" convention of "1, 2.5, 5, 10". The thing is, on a scope or any other piece of test equipment you learn to recognize almost instinctively what changing a setting will do to what you see on the display, so switching from a Tek to an HP scope comes with a short learning curve. So my question comes in a few parts:
a) Which are you most familiar with?
b) Which do you have the most trouble with?
c) Which makes the most sense to you?
d) Does it not matter, as long as you can figure it out?

I realize that today a lot of test equipment mixes many different conventions, and people likely own gear with all of the above, or even direct keypad entry of settings. But at the very least I'd find the results interesting with respect to analog test equipment, where these "decades" are hard-wired and often overlooked when selecting an instrument. Just a question that's been on my mind since I was a calibration tech in the Army...
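
For the curious, here's a quick Python sketch (my own illustration, not taken from any instrument or manual) of why both conventions exist: each one tries to divide a decade evenly on a logarithmic scale. 1-2-5 gives three roughly even steps per decade (the ideal step would be 10^(1/3) ≈ 2.154), while the HP "3" is really √10 ≈ 3.16, giving two exactly even steps per decade.

Code:
import math

# Position of each step within a decade, in log10 units (0 = start, 1 = end).
# Perfectly even spacing would be thirds (0, 0.333, 0.667) or halves (0, 0.5).
conventions = {
    "Tek 1-2-5":  [1, 2, 5],
    "HP 1-3-10":  [1, 3.16],   # the "3" marking is really sqrt(10)
    "compromise": [1, 2.5, 5],
}

for name, steps in conventions.items():
    positions = [round(math.log10(s), 3) for s in steps]
    print(f"{name:11s} log10 positions: {positions}")

# Tek 1-2-5   log10 positions: [0.0, 0.301, 0.699]
# HP 1-3-10   log10 positions: [0.0, 0.5]
# compromise  log10 positions: [0.0, 0.398, 0.699]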
 
I'd go with 1 - 2 - 5 - 10

There is always going to be an uneven step when trying to fit approximate powers of two into a decade, but I think that's the most convenient set. It's also the set of steps generally used for money (coins and notes mostly come in 1, 2 and 5 denominations), so most people's minds are already used to it.
 
An interesting question.

I think the answer is "It all depends on the application".

On an oscilloscope, like this ancient Telequipment D75 from the 1970s:

[photo: Telequipment D75 oscilloscope front panel]

The 1, 2, 5, 10 sequence is good.
When measurements are made by interpolating the position of the trace against a graticule with a (usually) 1 cm grid, knowing that each grid square is 1, 2 or 5 volts makes the mental calculations easier for those of us who are "mental arithmetic challenged". For example, 3.4 divisions at 2 V per division is 6.8 V, easily done in your head. If the 1 cm square represented 2.5 or 3 volts, life would get difficult.

However, where meters are involved, the 1, 3, 10 sequence has advantages.
Look at this Hewlett-Packard signal generator, in particular the output level switch and the meter:

[photo: HP signal generator output level switch and meter]

The switch is ordered in a 1, 3, 10 sequence, as is the meter.
But look closely at the 0 - 3 meter scale: full scale is actually 3.2, corresponding to 10 on the 0 - 10 scale. (3.2 is close to √10 ≈ 3.16, so the two scales sit exactly 10 dB apart in voltage.)

The reason is the dBm scale.
Note that the outer edge of the switch is calibrated in steps of 10 dBm.
Doing the maths:
0 dBm in 50 Ohms is 0.224 volts
+10 dBm in 50 Ohms is 0.71 volts
-10 dBm in 50 Ohms is 0.071 volts
Note that on the meter scales, 0.224 on the 0 - 3 scale lines up with 7.1 on the 0 - 10 scale, so the -10 to +3 dBm scale always reads correctly.
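
If you want to check those numbers, here is a quick Python sketch (my own illustration; the helper name dbm_to_volts is just something I made up, not anything from HP). The formula is V = √(P × R), with 0 dBm defined as 1 mW:

Code:
import math

def dbm_to_volts(dbm, impedance_ohms):
    """RMS voltage for a given power in dBm across a given impedance."""
    power_watts = 1e-3 * 10 ** (dbm / 10)   # 0 dBm = 1 milliwatt
    return math.sqrt(power_watts * impedance_ohms)

for level in (-10, 0, 10):
    print(f"{level:+3d} dBm in 50 Ohms = {dbm_to_volts(level, 50):.3f} V")

# -10 dBm in 50 Ohms = 0.071 V
#  +0 dBm in 50 Ohms = 0.224 V
# +10 dBm in 50 Ohms = 0.707 V

(0.707 V rounds to the 0.71 V quoted above.)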

Now look at the meter and switch on this Marconi Instruments Distortion Factor Meter:

[photo: Marconi Instruments distortion factor meter range switch and meter]

Again there is a 1, 3, 10 sequence to the ranges, and like the signal generator, the 3 scale is actually a 3.2 scale.
Also, there is a dBm scale on the meter and the switch, but this time:
0 dBm corresponds to 0.775 volts
+10 dBm corresponds to 2.45 volts
-10 dBm corresponds to 0.245 volts

These values are quite different from the scales on the signal generator.
The reason is that this item is intended for audio applications, not RF applications.
RF people work in 50 Ohm circuits (usually) and AF people work in 600 Ohm circuits (usually). There are exceptions in both cases.
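
Running the same little sketch at 600 Ohms reproduces the distortion meter's figures:

Code:
# Reusing the dbm_to_volts() helper from the sketch above.
for level in (-10, 0, 10):
    print(f"{level:+3d} dBm in 600 Ohms = {dbm_to_volts(level, 600):.3f} V")

# -10 dBm in 600 Ohms = 0.245 V
#  +0 dBm in 600 Ohms = 0.775 V
# +10 dBm in 600 Ohms = 2.449 V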

So, as I said earlier, it all depends on the application.

JimB
 