The 'blind' navigator robot


stik

New Member
Hi all

I've just been given an awesome electronics project. Here's a brief description of what the project requires:

A navigation system must be designed to guide a toy tank along a predetermined path. The only references that the robot will have are two orthogonally placed sound sources, of 600 Hz and 1200 Hz. Using these references, the robot will have to be able to determine its current position and guide itself along a known set of waypoints.

With only two microphones and no moving parts, a suitable positioning system must be designed to determine the location of the robot. This system will be interfaced to the existing tank body. The robot will be first initialised by placing it at a known position. Then, using the sound references only, off it must go round the course!

I was wondering if anyone had some cool ideas of possible methods of using the sound sources to navigate.

THANKS! :)
 
I have thought of calculating the phase difference with respect to each speaker at each location by measuring the difference in arrival time between the two microphones. Knowing that time difference and the speed of sound, you can work out the difference in path length from the speaker to each of the two microphones and hence calculate the 'azimuth' angle (similar to how humans localise sound).
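As a rough illustration of that geometry (the spacing and timing numbers below are made up), the arrival-time difference maps to an angle like this:
Code:
import math
SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature
def bearing_from_tdoa(delta_t, mic_spacing):
    """Angle (radians) off the mic baseline's normal, from the difference
    in arrival time between the two mics. Assumes a far-field (planar)
    wavefront, so the extra path = mic_spacing * sin(angle) = c * delta_t."""
    ratio = SPEED_OF_SOUND * delta_t / mic_spacing
    if abs(ratio) > 1.0:
        raise ValueError("time difference too large for this mic spacing")
    return math.asin(ratio)
# Example: mics 0.2 m apart, peak arrives 0.3 ms later at the far mic
print(math.degrees(bearing_from_tdoa(0.3e-3, 0.2)))  # roughly 31 degrees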

However, while I think this method might work well with one sound source, having two different sources may be a bit of a problem. The constructive and destructive interference pattern may make it very difficult.
So I'm thinking of maybe determining a grid pattern of constructive and destructive interference based on the wavelengths of the two sources (I've attached images that I generated in MATLAB of what I think this may look like). By counting how many points of constructive/destructive interference you pass, you could calculate how far you have travelled. This too, I think, will be very difficult, because I have no idea how to visualise the interference pattern in practice, or whether the two sources will be in phase, of the same amplitude, etc.
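For reference, a rough sketch of how such an interference map could be generated (this isn't my actual MATLAB script; the speaker positions and amplitudes are placeholders):
Code:
import numpy as np
SPEED_OF_SOUND = 343.0  # m/s
def pressure_field(freqs, speaker_positions, size=4.0, step=0.02, t=0.0):
    """Instantaneous pressure over a size x size grid from ideal point
    sources (one frequency per speaker), ignoring reflections."""
    xs = np.arange(0.0, size, step)
    X, Y = np.meshgrid(xs, xs)
    total = np.zeros_like(X)
    for f, (sx, sy) in zip(freqs, speaker_positions):
        r = np.hypot(X - sx, Y - sy) + 1e-6   # distance to this speaker
        k = 2 * np.pi * f / SPEED_OF_SOUND    # wavenumber
        total += np.cos(k * r - 2 * np.pi * f * t) / r  # spherical spreading
    return X, Y, total
# Two orthogonally placed speakers, 600 Hz and 1200 Hz (positions guessed)
X, Y, field = pressure_field([600.0, 1200.0], [(2.0, -1.0), (-1.0, 2.0)])
print(field.shape, field.min(), field.max())  # plot 'field' to see the pattern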

What do you think? Any ideas at this point would be immensely appreciated! :)
 

Attachments: interferencepattern.PNG, interferencepattern1.PNG
How big is the robot's operating field? Phase measurements would only work if the frequencies are low enough that the higher frequency covers the whole field within one cycle. At 1200 Hz one wavelength is about 0.29 m (0.94 ft); beyond that distance the phase relationship repeats and becomes ambiguous. This approach may also require a precision clock, good to the microsecond, shared by robot and beacon.
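A quick back-of-the-envelope check of those repeat distances (just the arithmetic, nothing project-specific):
Code:
SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature
for f in (600.0, 1200.0):
    wavelength = SPEED_OF_SOUND / f
    print(f"{f:.0f} Hz -> wavelength {wavelength:.2f} m "
          f"({wavelength / 0.3048:.2f} ft); phase repeats every wavelength")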

Triangulation is possible using highly directional mikes. Look at this article. Employ sound instead of lasers:



Imagine a mike at the end of a sound-deadening tube. The mike is rotated by a servo motor until the received sound is loudest; that gives the relative bearing to the sound beacon. Repeat for each beacon. Sound reflections from other parts of the room could give false bearings, so you may want to treat a bearing as "true" only when the reading exceeds a certain intensity.
 
Thanks so much. I'll take a look... Unfortunately, we are not allowed to move the microphones once they are attached to the robot. I suppose I could probably just turn the tank on the spot in order to survey the area.

The robot's operating field is a 4m x 4m square. I've attached a diagram (not to exact scale) of the positions of the speakers with respect to the operating area. The robot will begin at a known position, the 'origin'. We have been told that this will be exactly in the middle of the operating area, and that the robot will initially be directly facing a speaker. The robot will have to calibrate itself to this known position. Waypoints will be provided on the testing day, and will be given as coordinates (in metres) with respect to the origin, e.g. 1 m upwards is (0,1).

I am just very confused as to whether I will in fact be able to calculate phase difference with all the constructive and destructive interference occurring in the region, since the sources are an octave apart in frequency. Do you think that by using very narrow bandpass filters I will be able to differentiate between the two frequencies? I am also uncertain, for example, as to whether certain areas will be 'dead spots' where destructive interference means nothing is heard. Will the interference pattern due to the two sources be constantly changing? :confused:
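For instance, I imagine something like the sketch below (purely illustrative, assuming a sampled microphone signal) could measure the strength of each tone separately in software, which is roughly what narrow analogue bandpass filters would do in hardware:
Code:
import math
def goertzel_power(samples, sample_rate, target_freq):
    """Relative power of one frequency in a block of samples (Goertzel
    algorithm) - handy for telling 600 Hz and 1200 Hz apart."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2
# Made-up test: a mix of both tones sampled at 8 kHz for 50 ms
fs = 8000
mix = [math.sin(2 * math.pi * 600 * i / fs) + 0.5 * math.sin(2 * math.pi * 1200 * i / fs)
       for i in range(400)]
print(goertzel_power(mix, fs, 600), goertzel_power(mix, fs, 1200))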

Also, do you have any suggestions as to how I can measure distance? I have been told that using amplitude is not a good idea, but I am not sure why... Perhaps I should just use the timing of the motors and calculate with respect to the origin?
 

Attachments: operatingarea.PNG
I presume this is a course project. What is the course title and what subjects have you covered pertinent to the task at hand?

I'm trying to get a reference for the solutions based on covered material.

Based on the last post I might be looking at a single microphone and a compass. Two LM567 tone decoders could be set to the two frequencies. The output state could be sent to a microcontroller, which could tell which frequency was being received. This assumes sufficient period between each frequency for the 567 to lock and change state.

Without the ability to rotate the microphone, the tank has to be able to turn on its centre. Any sideways motion between taking the two bearing readings will introduce error in the position fix.

Your project is not trivial.
 
Yup, I realise that. They sorta always throw us in the deep end with our projects :p

This is a third year electrical engineering subject called Electrical Engineering Design. Assumed prior knowledge is all of my 1st and 2nd year subjects. This course is pretty much self-guided, and we are tasked with finding and implementing our own solution to the problem.

In your idea, how will the tone detectors be able to provide information regarding phase (or other information perhaps, regarding location)? I'm assuming at the moment that the tank will be able to rotate on the spot, however I have not received the tank as yet. We are still in the 'paper design' phase.

Unfortunately, a requirement of the project is that we use 2 mic's in our design. The speakers will also operate independent of the robot, so there is no way to closely monitor timing between the robot and the speakers. All the robot will be able to do is detect sound from each source and use that information to navigate...

Thanks for your help!
 
The devil is in the details. I presume you received a task description that was a written page long. There must be more detail than you are sharing with us.

I'd like to see the task description.
 
Two mikes, angled 90 degrees apart, and just compare the signal levels from them - rotate the robot until the levels are the same.

This is pretty well how your ears work.
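As a very rough sketch of that logic (the read/turn functions below are placeholders, not any particular robot's API):
Code:
def turn_toward_source(read_left, read_right, turn_left, turn_right, stop,
                       tolerance=0.05, max_steps=200):
    """Bang-bang homing: rotate until the two angled mics report the same
    level. All five callables stand in for the real ADC reads and motor
    commands, whatever they turn out to be."""
    for _ in range(max_steps):
        left, right = read_left(), read_right()
        if abs(left - right) <= tolerance * max(left, right, 1e-9):
            stop()
            return True      # roughly facing the source
        if left > right:
            turn_left()      # source is off to the left, so turn that way
        else:
            turn_right()
    stop()
    return False             # did not converge within max_steps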
 
Hi Nigel,

I know from physics, using the inverse-square law for sound attenuation over distance, that the intensity at a distance r from a point source is given by:

I = Io/(4*pi*r^2)

Looking at the attenuation in dB, the inverse-square law works out to about 6 dB of loss per doubling of distance rather than a fixed amount per metre. However, this assumes the source radiates spherically in all directions; since the sources will be speakers radiating into a narrower 'cone', I'm not sure how much that changes things. Do you think that kind of difference in intensity will be enough to work with? Won't it lead to larger errors given that the operating area is 4m x 4m?
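Just to put rough numbers on the inverse-square falloff (my own arithmetic; ideal free-field point source assumed):
Code:
import math
def level_drop_db(r1, r2):
    """dB change going from distance r1 to r2 from an ideal point source
    (inverse-square law, free field, no directivity or reflections)."""
    return 10.0 * math.log10((r1 / r2) ** 2)
print(level_drop_db(1.0, 2.0))  # -6.0 dB per doubling of distance
print(level_drop_db(1.0, 3.0))  # about -9.5 dB going from 1 m out to 3 m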

Thanks very much for your idea! Looking forward to your reply!
 
I haven't come across a double-beam scope before. Could you give me some more details on what it is? Do you mean an oscilloscope? Thanks
 
Two mikes, angled 90 degrees apart, and just compare the signal levels from them - rotate the robot until the levels are the same.

Adding demodulators might help, IMHO: one tuned to 600 Hz and the other to 1200 Hz. Getting the same peak value on two mics angled 90 degrees apart, one demodulated at 600 Hz and the other at 1200 Hz, should be accurate enough to find the proper orientation. But determining the distance from the source using only amplitude on a 4x4 m field is, IMHO, hardly possible, so if the goal is to follow a predefined "path" I'm not sure amplitude will work... To be honest, I'm pretty sure it will not work, as the frequencies are quite low. If you check the audio textbooks, humans are not really able to locate low-pitched sounds, while high-pitched ones we locate easily. It is not only how our ears are designed: low frequencies travel further, lose energy more slowly and bend around obstacles easily (in contrast to high frequencies, which lose energy quickly).


On the other hand, 600 Hz and 1200 Hz are in a 1:2 ratio and are low enough for peak detection (or zero crossing) using fairly cheap electronics, and the time between peaks is measurable with entry-level uC's. I'd not say that is done by accident :) so I believe working out the initial direction from the overlapping patterns is the intended way for the course. But some additional questions need answers: the field is 4x4 m, but how big is the "tank" (how far apart can the mics be), where are the speakers going to be (how far from the field, in what direction, how far apart), and what will the field's surroundings be?

In general you should be able to measure the overlap of the two signals at two points (the two mics) and locate your "exact" position on the field as well as your heading. You will get the same or similar measurements at a few spots on the field, but knowing where you are at the beginning of the "trip", you can work out which one is correct and calculate the path.
 
Hi arhi,

Thanks very much for your idea! Sounds like a good plan. I have attached a diagram of the robot's operating area in a previous post. We have been told that the speakers will be exactly orthogonally placed. The testing will be done in a large hall, so echoes (as well as my fellow classmates' chatter :p) may be a problem.

Does your method use the timing between peaks detected by each microphone to calculate the angle to each speaker/frequency source individually (like the figure I posted)? I am a little confused as to whether the interference pattern created by the orthogonally placed speakers will make it difficult to detect peaks. Will filtering with narrow bandpass filters be sufficient? I have not been given any information about the dimensions of the tank, but I assume it will be big enough to place the mics as far apart as needed...
 

Attachments: phasecalc.PNG
Your point-in-space calculations will require at least the Pythagorean theorem or trigonometric functions, so you need a microcontroller or processor that can perform floating-point calculations. Consider one of the C/C++ toolchains available for microcontrollers, with appropriate maths libraries. If a feedback control system is incorporated, computing power also needs to be considered.
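As a rough illustration of that trigonometry (the speaker coordinates below are placeholders, not the actual course layout), a position fix from two compass-referenced bearings might look like this:
Code:
import math
def fix_from_bearings(speaker_a, bearing_a, speaker_b, bearing_b):
    """Intersect the two sight lines. Each bearing is an absolute
    (compass-referenced) angle, in radians, from the robot towards the
    corresponding speaker at a known position."""
    ax, ay = speaker_a
    bx, by = speaker_b
    ca, sa = math.cos(bearing_a), math.sin(bearing_a)
    cb, sb = math.cos(bearing_b), math.sin(bearing_b)
    # robot = speaker_a - ra*(ca, sa) = speaker_b - rb*(cb, sb); solve for ra
    det = -ca * sb + cb * sa
    if abs(det) < 1e-9:
        raise ValueError("sight lines are parallel; no unique fix")
    ra = ((bx - ax) * sb - (by - ay) * cb) / det
    return ax - ra * ca, ay - ra * sa
# Made-up example: speakers just off two sides of the field
print(fix_from_bearings((2.0, 5.0), math.radians(90.0),
                        (5.0, 2.0), math.radians(0.0)))  # about (2.0, 2.0)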

The operating area is large enough that “slippage” during track turns should add little error to your location measurements.

I see two possibilities for the directional measurement:

First: You could use two microphones pointed in the same direction. They need to be close enough together that the signal phase difference allows the robot to be turned until both microphones see the same phase. The phase would probably be determined by comparing the signals received by each mike at the same instant; when both signals are the same, the mikes are pointing directly at the sound source. This requires mikes with very similar output for a given sound level. A microcontroller, operating at megahertz speeds, will not see a sine wave, only the signal level at the instant the sound is sampled.

Second: The two microphones could be mounted close together, in the same direction, each with a directional sound-dampening tube that provides the highest measured signal when the microphone is pointed directly at the sound source. The microphone pair then provides a differential reading: when both report the same signal level, the sound source is directly ahead. I’d be inclined to try this approach.

Either approach requires readings for each of the speakers in order to triangulate position. Since the mikes can’t move, the robot has to rotate, which suggests electronic compass readings. I wouldn’t depend on odometry to measure rotational change on a tracked vehicle.

Consider the issue of reflected sound, noise. The mikes will receive sounds bounced off walls and other objects. Only consider signals that exceed a minimum threshold level.

I’m confused because the task permits “additional sensors, etc.,” but “no moving parts.” Odometry might be a second measurement routine to consider. I’d propose placing an idler wheel under the centre of the robot to report distance and direction travelled. Perhaps a wheel-less (optical) computer mouse, which has no moving parts, could be used instead.
 
I just wrote what seemed possible to me - mostly because 600 Hz and 1200 Hz are low enough to detect peaks, they are in a 1:2 ratio, and I do not believe that is a "coincidence"... A 2-channel scope (one beam per channel is a good idea :) - any DSO should do too) and a simple test should give you many more ideas than what I can come up with in a short time :) ...

Low frequencies do not lose energy that easily, so on a 4x4 m field I do not believe you can use amplitude for measuring distance. You need to analyse the intersection of the data at those two points (mics). It is not easy for me to visualise at the moment, and I do not have time to set up a test environment to check the results with a scope.
 
arhi,

I think your amplitudes concept could have merit. It does involve some variables:

1. Are the amplitudes consistent over time?

2. Are the amplitudes consistent as the angle sweeps either side of the speaker centerline? If not, then range can't be calculated without also measuring the bearing to the speaker.

3. Can builders place robots in the test area beforehand to actually measure the sound intensity at various positions in the test area?

If sound intensity is a function of distance from the speaker, then each sound level creates an arc of fixed distance from each speaker. The robot is where the two arcs cross.
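As a sketch of that arc intersection (the speaker positions and ranges below are made up for illustration):
Code:
import math
def circle_intersections(c1, r1, c2, r2):
    """Return the 0, 1 or 2 points where two range circles intersect."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []                          # no usable fix
    a = (r1**2 - r2**2 + d**2) / (2 * d)   # distance along the centre line
    h = math.sqrt(max(r1**2 - a**2, 0.0))  # offset from the centre line
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    ox = h * (y2 - y1) / d
    oy = h * (x2 - x1) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]
# Example: 3 m from each of two guessed speaker positions; one of the two
# candidate points falls inside the field, the other outside
print(circle_intersections((2.0, 5.0), 3.0, (5.0, 2.0), 3.0))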
 
I'm maybe completely off the chart here, and I'd really like to see the signal and do some measurements before I ramble on about things like this, especially as analog stuff is completely outside my area of interest... Anyhow, the rambling continues: if you look at the wave coming from the speaker, you can assume roughly radial propagation. With that visualised, the timing of the peaks at the two mics should be slightly different depending on the angle towards the speaker...

Looking straight at the sound source, both mics sit on the same wavefront and see the peaks at the same time:
Code:
-----+----------------------------------
     O
     O
-----+----------------------------------

On the other hand, at an angle, the wavefront reaches one mic before the other:
Code:
----+-----------------------------------
      O
         O
-----------+----------------------------

Now, after writing this, I'm doubting it, but check this out: the wavelength at 600 Hz is ~0.56 m and at 1200 Hz it is ~0.28 m, so it actually makes sense. Because the wavelength is so "big", you can use the difference in arrival time between peaks at the two mics to measure the "angle". You could also use the Doppler shift to measure your speed towards the sound source.
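For scale, a quick Doppler calculation (the tank speed here is just a guess) shows how small that shift would be:
Code:
SPEED_OF_SOUND = 343.0  # m/s
def doppler_shift(f_source, speed_towards_source):
    """Observed frequency for a receiver moving towards a stationary source."""
    return f_source * (SPEED_OF_SOUND + speed_towards_source) / SPEED_OF_SOUND
# A toy tank doing, say, 0.2 m/s towards the 1200 Hz speaker:
print(doppler_shift(1200.0, 0.2) - 1200.0)  # shift of roughly 0.7 Hz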

In general, to be able to walk the path with precision
- you need to know your heading
- you need to know your speed

With those two pieces of information you can calculate where you are at any moment relative to your starting position.
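For what it's worth, a minimal dead-reckoning sketch of that idea (the heading/speed samples are invented):
Code:
import math
def dead_reckon(samples, dt):
    """Integrate (heading_radians, speed_m_per_s) samples taken every dt
    seconds into a track of positions relative to the starting point."""
    x = y = 0.0
    track = [(x, y)]
    for heading, speed in samples:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        track.append((x, y))
    return track
# Invented example: 2 s heading 'north' then 2 s 'east', both at 0.25 m/s
samples = [(math.pi / 2, 0.25)] * 20 + [(0.0, 0.25)] * 20
print(dead_reckon(samples, 0.1)[-1])  # ends up near (0.5, 0.5)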

Now, the math gets a bit complex: you need to use both sources. They compensate for each other and give you twice the resolution of one alone, and they allow accurate measurement in any direction. With a single sound source you would not be able to make any measurement while moving parallel to the wavefronts, and you would have to tack like a sailboat (40 degrees to one side, 40 degrees to the other) in order to get where you wanted.
 
@bobledoux, forget about using the amplitude/intensity for any valuable info. The interesting point is the timing of the amplitude peaks; the actual intensity is irrelevant. On the 4x4 m field there are several peaks (wavelengths) present at the same time, so even if you could measure the loss of intensity over distance, there is just not enough information to get any precision. The other problem is that low frequencies lose energy very slowly, so there will be no measurable difference in the wave's intensity across the 4x4 m field.
 