Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.

mm precision triangulation/multilateration using RF ?

Status
Not open for further replies.

OsRos

New Member
Hi everybody,

I am more of a software guy (so please be patient if I don't make sense), but this piece is something I am really interested in as a component of a larger project.

I am trying to locate an object using RF with millimeter precision within a range of 40-50 m (indoors). The object itself would be a transmitter, communicating with multiple receivers in the room to determine its own position. I have been looking at:

https://en.wikipedia.org/wiki/Trilateration
https://en.wikipedia.org/wiki/Multilateration

My concern is mostly whether it's possible to measure the different delays with enough precision at such short distances, given that RF travels at such high speed. Both the range and the precision are paramount to the project, and it needs to be safe for people and fairly unintrusive. Would ultrasound be a better choice?

I would really appreciate some help. In fact, ideally (since this is not my area of expertise) I would be very interested in working with somebody as a consultant/tutor.

I am located in Toronto, so geographic proximity would be preferable but I could also work with somebody remotely over the web.

Hope somebody can give me a hand...
 
Some thoughts -

A. Whether ultrasound or RF, there will be some reflection or blockage of sound/RF by things in the environment. Not sure what the intended environment is like.

B. If you plan to use RF you might first determine what frequencies and power are allowed. Keep in mind that you'll likely be sharing frequencies with others - you can probably deal with that but it adds to the complexity.

C. You might consider interferometry as a way to determine increments of distance as things change. With proper antennas, the signal strength would rise and fall in one-wavelength increments. A higher frequency means a shorter wavelength, and therefore greater resolution.

Sounds interesting - probably a lot of work and probably something others have tried.
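To put some numbers on point C, here is a minimal sketch of the wavelength-to-frequency relationship that sets the fringe spacing in an interferometric scheme. The frequencies below are just illustrative choices, not values from the thread:

```python
# Free-space wavelength for a few candidate carrier frequencies.
# The wavelength is the distance increment over which an interferometric
# signal repeats, so shorter wavelength = finer distance resolution.
C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength in metres for a given carrier frequency."""
    return C / freq_hz

for f in (915e6, 2.4e9, 5.8e9, 60e9):
    print(f"{f / 1e9:6.2f} GHz -> wavelength {wavelength_m(f) * 100:7.3f} cm")
```

Even at 60 GHz the wavelength is about 5 mm, so resolving millimetres this way means resolving fractions of a fringe, which is where the practical difficulty lives.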
 
Back of the envelope calculation:
Code:
Speed of light ~ 1 foot/nanosecond
millimeters per foot ~ 300
3.3333.... picoseconds per millimeter
I'm pretty sure you can't build any kind of logic or processor circuit that can measure time differences that small. I don't know where the requirements came from, but I think you are several orders of magnitude away from being able to make this work.
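The envelope numbers above can be double-checked in a couple of lines; the speed of light is the only input:

```python
# Sanity check of the back-of-the-envelope figures:
# light covers ~1 foot per nanosecond, i.e. ~3.3 picoseconds per millimetre.
C_M_PER_S = 299_792_458.0  # speed of light, m/s

def ps_per_mm() -> float:
    """One-way propagation time for 1 mm of free-space travel, in picoseconds."""
    return (0.001 / C_M_PER_S) * 1e12

def ns_per_foot() -> float:
    """One-way propagation time for 1 foot (0.3048 m), in nanoseconds."""
    return (0.3048 / C_M_PER_S) * 1e9

print(f"{ps_per_mm():.3f} ps per mm")   # ~3.34 ps
print(f"{ns_per_foot():.3f} ns per ft") # ~1.02 ns
```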
 
Thanks for your comments,

I understand that it would be very difficult to measure the time differences directly. Here is something I am wondering about:

Could several trips be added together to allow for a higher number of counts? Not averaging the trips, but adding them, and only resetting the counter when, say, trip n is reached?

Each trip would have a delay component associated with the distance the wave travelled. If, say, 500 trips are added, they would form a greater total delay, one that could potentially be measured by an MCU?

The sum of delays for travelling 1 cm would be different from that for travelling, say, 2 cm, therefore allowing the distance to be calculated?

It might not be the most efficient way, but could it work?
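A quick arithmetic check of the summed-trips idea, assuming an idealised echo with no loss, jitter, or drift (all the numbers are the poster's hypothetical 500 trips at 1 cm and 2 cm):

```python
# Accumulated one-way propagation delay over repeated trips.
# Idealised: assumes each "trip" re-launches with zero added delay or jitter.
C = 299_792_458.0  # speed of light, m/s

def accumulated_delay_ns(distance_m: float, trips: int) -> float:
    """Total propagation delay summed over `trips` trips, in nanoseconds."""
    return trips * (distance_m / C) * 1e9

d1 = accumulated_delay_ns(0.01, 500)  # 1 cm x 500 trips
d2 = accumulated_delay_ns(0.02, 500)  # 2 cm x 500 trips
print(f"1 cm x 500 trips: {d1:.2f} ns")
print(f"2 cm x 500 trips: {d2:.2f} ns")
print(f"difference:       {d2 - d1:.2f} ns")  # ~16.7 ns
```

So the accumulated difference does grow into the tens-of-nanoseconds range a counter could see. The catch is that each trip needs a transponder that re-launches the signal with timing far better than the quantity being measured; its turnaround jitter accumulates over the 500 trips just as the propagation delay does.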
 
The difference signal for resolution that fine is going to be well below the noise floor. In theory it could probably be done, at a cost of some millions of dollars. Gravity Probe B is a space project designed to detect the frame-dragging effect of warped spacetime, and I think it's approaching the billion-dollar funding mark for development of the technologies used in it. The precision required to create a device that can reliably measure anything to 12 decimal places is incredibly difficult to achieve.
 
A variation of Continuous-wave radar from wikipedia:

"Ranging can be implemented, however, through a technique known as "chirping", or frequency modulated continuous-wave radar. In this system the signal is not a continuous fixed frequency, but varies up and down over a fixed period of time. By comparing the frequency of the received signal to the one currently being sent, the difference in frequency can be accurately measured, and from that the time-of-flight can be calculated."

Because frequency is one measurement that can be made very accurately (it's directly related to your time-base accuracy; think cesium-beam oscillator), this method might be able to get the resolution you are looking for. It won't be cheap, but it would be proven, practical tech.

Lefty
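The FMCW arithmetic Lefty describes can be sketched as follows. The sweep bandwidth and sweep time below are illustrative assumptions, not values from the thread; the formulas (R = c·f_b·T / 2B and the classical resolution limit ΔR = c / 2B for a linear sweep) are standard FMCW relations:

```python
# Range and range resolution for a linear FMCW sweep.
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    """Range from the measured beat frequency: R = c * f_b * T / (2 * B)."""
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

def fmcw_range_resolution_m(sweep_bw_hz: float) -> float:
    """Classical FMCW range resolution: dR = c / (2 * B)."""
    return C / (2.0 * sweep_bw_hz)

# Illustrative: a 4 GHz sweep over 1 ms (assumed numbers).
B, T = 4e9, 1e-3
print(f"resolution with a 4 GHz sweep: {fmcw_range_resolution_m(B) * 1000:.1f} mm")
# For 1 mm resolution you would need B = c / (2 * 0.001), i.e. ~150 GHz of sweep:
print(f"bandwidth needed for 1 mm: {C / (2 * 0.001) / 1e9:.0f} GHz")
```

The resolution formula is the sobering part: even a very wide 4 GHz sweep gives roughly 4 cm resolution, and 1 mm would need on the order of 150 GHz of swept bandwidth, which points back at why millimetre RF ranging is so hard.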
 
