Maybe I should do this as a poll... but here's an explanation followed by the question.
Most test equipment has dials where amplitude, frequency, or time goes in steps. On test equipment based on Tektronix conventions, the steps take the form "1, 2, 5, 10"... so, for instance, the voltage ranges of an oscope would step "1 mV, 2 mV, 5 mV, 10 mV..." per centimeter. On test equipment based on Hewlett-Packard conventions, the decades go in steps of "1, 3, 5, 10"... and occasionally I see a "compromise" convention of "1, 2.5, 5, 10" (there's a quick sketch comparing the three at the end of this post). The thing is, on an oscope or other piece of test equipment you learn almost instinctively what changing a setting will do to what you see on the display, so switching from a Tek to an HP scope comes with a short learning curve.

So, my question comes in a few parts:
a) Which are you most familiar with?
b) Which do you have the most trouble with?
c) Which makes the most sense to you?
d) Does it not matter as long as you can figure it out?
I realize that today a lot of test equipment uses many different conventions, and people likely own gear with all of the above, or even direct keypad entry of settings. But at the very least, I would find the results interesting with respect to analog test equipment, where these "decades" are hardwired into the range switch and often overlooked when selecting a piece of test equipment. Just a question that's been on my mind since I was a calibration tech in the Army...
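For reference, here's a quick Python sketch (just an illustration, nothing authoritative) that expands each of the three step sequences described above across a few decades, so you can see the resulting range ladders side by side. The step sequences are taken straight from the post; everything else (the function names, the 1 mV-to-100 mV span) is made up for the example.

```python
# Compare the three range-step conventions by expanding each per-decade
# step sequence across a few decades. The sequences come from the post;
# the decade span (1 mV .. 100 mV) is just an illustrative choice.

CONVENTIONS = {
    "Tek (1-2-5)":          [1, 2, 5],
    "HP (1-3-5)":           [1, 3, 5],
    "compromise (1-2.5-5)": [1, 2.5],
}
CONVENTIONS["compromise (1-2.5-5)"].append(5)  # 1, 2.5, 5 per decade

def expand(steps, decades=(1e-3, 1e-2, 1e-1)):
    """Expand a per-decade step sequence across the given decade multipliers."""
    return [s * d for d in decades for s in steps]

def fmt(volts):
    """Format a value in mV below 1 V, in V otherwise."""
    return f"{volts * 1e3:g} mV" if volts < 1 else f"{volts:g} V"

for name, steps in CONVENTIONS.items():
    print(f"{name:>22}: " + ", ".join(fmt(v) for v in expand(steps)))
```

Running it prints each ladder on one line; the Tek row, for instance, comes out as 1 mV, 2 mV, 5 mV, 10 mV, 20 mV, 50 mV, 100 mV, 200 mV, 500 mV, which is exactly the progression you'd click through on the volts/div knob.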