Hello, I have a slight problem using ultrasonic transducers. Specifically, in a circuit with two of them, one acting as the emitter and the other as the receiver, the amount of time between the initial 40 kHz pulse from the transmitter and any voltage appearing at the receiver is too great.
For example, when they're set 30 cm from one another, the delay should be around 800 µs. Instead, I see a delay of about 1 ms, which is significant. And yes, I've compensated for the temperature and run a bunch of tests at different distances; the Δt is still too long.
Is this normal behaviour for these ultrasonic transducers? I have no datasheet for them, and the only explanation I can think of right now is the mechanical inertia of the transducer: the piezo element needs several drive cycles to ring up to a detectable amplitude. If so, I should be able to compensate for it by subtracting a constant amount of time from the result each time a measurement is taken.
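Since I already have measurements at several distances, I figure the constant lag can be estimated with a straight-line fit: delay = distance / c + offset. Here's a minimal sketch of that idea (the distances and speed of sound below are just example values, not my actual data):

```python
def fit_delay_line(distances_m, delays_s):
    """Ordinary least-squares fit of delay = slope * distance + offset.

    If the lag really is a constant, slope should come out near 1/c
    (~2.92 ms per metre at 20 degrees C) and offset is the constant
    amount to subtract from every measurement.
    """
    n = len(distances_m)
    mean_d = sum(distances_m) / n
    mean_t = sum(delays_s) / n
    cov = sum((d - mean_d) * (t - mean_t)
              for d, t in zip(distances_m, delays_s))
    var = sum((d - mean_d) ** 2 for d in distances_m)
    slope = cov / var
    offset = mean_t - slope * mean_d
    return slope, offset


if __name__ == "__main__":
    # Synthetic example: c = 343 m/s with a hypothetical 200 us lag.
    dists = [0.1, 0.2, 0.3, 0.4]
    delays = [d / 343.0 + 200e-6 for d in dists]
    slope, offset = fit_delay_line(dists, delays)
    print(f"slope = {slope:.6e} s/m, offset = {offset * 1e6:.1f} us")
```

A side benefit of fitting both terms: if the recovered slope doesn't match 1/c, the error isn't a simple constant lag and subtracting a fixed time won't fully fix it.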
Any thoughts?