Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.

Ultrasonic Distance measurement using Phase Shift Detection help (with atmega32)

Status
Not open for further replies.

kris_maher

New Member
Hi,

I'm building an ultrasonic rangefinder. It transmits an amplified square wave at 40kHz and receives (via the receiver circuit) a weakened sine wave, which is then filtered, amplified and converted back to a square wave before being sent to the microcontroller for processing.

I initially tried measuring the echo delay between the transmitted and received signals, but perhaps my code had a glitch.

Anyway, I've decided to try the "phase shift detection" method for distance measurement instead. I would need to measure the phase difference between the square waves by sampling them with the ADC.

Just wondering if someone can point me in the right direction, since I've never done anything related to phase shift before (except in engineering maths). The return signal is at 40kHz as well, and I'm not using any extra hardware for phase shifting; I'm expecting to do it all in software once the received signal is read in by the micro. Also, I'm using an external 16MHz crystal.

Thanks again,

PS: My code already transmits a continuous 40kHz square wave using Timer 1 in CTC mode with interrupts.

PS2: I found this formula from some research, though I'm not sure whether it's correct for my job:
Phase Shift = (2*pi*f*L) / C.

For C = speed of sound (taken as 340.29m/s), f = frequency (40kHz), L = distance of object
 
It's good to see that you've made a beginning. Here are some questions that may help you make some design choices.

40kHz has a period of 25 microseconds. How frequently will your ADC need to sample in order to get the phase resolution that you want? Is it fast enough?

How far does the sound travel in 25 microseconds (one cycle of 40kHz)?
 
My lecturer suggested I take 100 samples before working out the distance.

It's a 10-bit ADC, so it should be fine. My lecturer gave me this chip (the ATmega32), so it should work with his suggested value of 100 samples.
 
You need to check your math once again. If sound travels 340.29 meters in a second, it travels 340.29 millimeters in a millisecond and, finally, 340.29 micrometers in a microsecond.

25µs/cycle * 340.29µm/µs = 8507.25µm, or about 8.5 millimeters (roughly 1/3 inch) per cycle.

The ATmega32 ADC requires 13 ADC clock cycles to complete a normal conversion. With an ADC clock of 1MHz, the ATmega32 produces each 10-bit value in 13µs. (Note that the datasheet recommends an ADC clock between 50kHz and 200kHz for full 10-bit resolution.)

You do know that your speed of sound figure (340.29m/s) is not good to five significant digits? Over a normal day's temperature range it can vary by at least 2%, and it's also influenced by humidity and barometric pressure.
 