
# Nanosecond Timer

Status
Not open for further replies.

#### Morganrich1

##### New Member
If I were using a 1GHz processor, would it be possible to program a timer accurate to one nanosecond? Likewise, could a 2GHz processor be accurate to 0.5 nanoseconds?

I want to measure distance using a timestamp sent from point A to point B via radio waves. I wanted to do this with an Arduino; however, it is only accurate to +/- 4 microseconds.

If the speed of light is 186,000 miles per second and I am measuring in microseconds, then (0.000001 × 186,000) = 0.186 miles, or 982.08 feet. If my margin of error is +/- 4 microseconds, wouldn't my accuracy only be good to (982.08 × 4), or +/- 3,928.32 feet?
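The arithmetic above can be sketched quickly. A minimal check, using the same rounded 186,000 mi/s figure from the post:

```python
# How much distance error does one timer tick translate into
# for a signal travelling at the speed of light?
MILES_PER_SECOND = 186_000   # speed of light, rounded as in the post
FEET_PER_MILE = 5_280

def distance_error_feet(timer_resolution_seconds):
    """Distance covered by light during one timer tick."""
    return timer_resolution_seconds * MILES_PER_SECOND * FEET_PER_MILE

print(distance_error_feet(1e-6))   # 1 us tick  -> ~982.08 ft
print(distance_error_feet(4e-6))   # Arduino's 4 us granularity -> ~3,928.32 ft
print(distance_error_feet(1e-9))   # 1 ns tick  -> ~0.98 ft
```

So the numbers in the post check out: microsecond resolution means roughly a thousand feet of uncertainty per tick, while nanosecond resolution gets you to about a foot.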


Ok well, a few more questions then. If I were measuring in microseconds and took a few hundred thousand measurements a second, then averaged the results, would I still be +/- 3,928.32 feet? Seems like a stupid question, I know; what I'm getting at is whether there is a workaround.

Does a 1GHz processor, like those found in many home computers, work the same as, say, an ATmega328? Could I burn the Arduino bootloader onto it and be up and running in the Arduino environment?

As far as I'm aware, you need hardware (and VERY fast hardware) to measure distance using light or radio waves.

What if I am not using the radio wave itself but the data it carries? Let’s say I have two clocks in sync. Clock A sends a message to clock B at 1:00:00 and clock B receives it at 1:00:01. If the message is sent via radio wave, would I not know that clock A and clock B are 186,000 miles apart?

But since the distance I am measuring is a few hundred feet at most, I do not have the luxury of seconds. This is what has led me to a 1 or 2 GHz CPU as opposed to an ATmega328, though I'm not sure if my theory is sound… or even if I can put it all together, for that matter. Which is also why I wonder if there is a workaround, or another easier method to measure distance very accurately.

Thanks by the way for your thoughts.

I used to work on a similar system measuring the return speed of light.

We set up a constant current source and gated it on and off into a capacitor. With a constant current, the capacitor voltage ramps linearly over time, from:

i = C dv/dt

Once you have the analogue voltage, measure it: it will be proportional to the time interval between the On and Off times.

From memory, we used to use a long tailed pair with pnp transistors with the capacitor in one of the collectors and switched the current between them.

Hope this helps
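The reason this works: integrating i = C dv/dt with constant i gives V = I·t/C, so the measured voltage maps linearly back to the gate time. A rough sketch of the arithmetic, with illustrative component values (the 1 mA and 100 pF figures are assumptions, not from the original circuit):

```python
# Time-to-amplitude converter: a constant current I charging a capacitor C
# produces V = I * t / C, so the time interval is t = C * V / I.
I = 1e-3      # 1 mA constant current (assumed value)
C = 100e-12   # 100 pF capacitor (assumed value)

def voltage_after(t_seconds):
    """Capacitor voltage after charging for t seconds."""
    return I * t_seconds / C

def elapsed_time(v_measured):
    """Recover the gate interval from the measured voltage."""
    return C * v_measured / I

# With these values a 100 ns gate yields an easily measurable 1 V ramp:
v = voltage_after(100e-9)
print(v, elapsed_time(v))
```

The appeal of the approach is that the fast part of the problem (nanosecond timing) is handled by analog physics, and the slow part (reading a DC voltage) can be done by any ordinary ADC.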

Hmmmm... well, as usual, what would seem simple is not at all. If microseconds is the best that can be done, could I fake nanosecond resolution by sending the same data 1,000 times, then dividing the time it takes to send it all by 1,000 for a measurement in nanoseconds? Or would I still be +/- 3,928 feet?

No more than you could measure a 1" distance with a yardstick 36 times to get 1" resolution.
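The yardstick point can be demonstrated: if the timer quantizes deterministically, every repeat returns the identical reading, so averaging adds no information. A small simulation (the timer model and the 370 ns interval are illustrative assumptions):

```python
def quantize(t_true, resolution):
    """An idealized timer: rounds the true interval down to its tick size."""
    return (t_true // resolution) * resolution

true_interval = 3.7e-7   # a 370 ns flight time, well below 4 us resolution
ticks = [quantize(true_interval, 4e-6) for _ in range(100_000)]
average = sum(ticks) / len(ticks)
print(average)   # identical to a single reading: 0.0 -- the timer never sees it
```

(With genuinely random noise or dither on each measurement, averaging can recover some sub-resolution information; the trouble is that a deterministic quantization error is not noise, so repeating the same measurement just repeats the same error.)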

Ok, so just to be clear: a 1 or 2 GHz processor cannot power a clock or timer with nanosecond resolution? It would seem that this would be a common problem – is there really no common answer? Is there no way to do this with software, provided it has the processing power to back it up?

If the answer is no, it can’t be done without a team of engineers and NASA’s budget, the next obvious question is how else it can be done. Are sound waves predictable enough? Besides sound waves, is there anything slower than light but as predictable, something where microseconds would work – milliseconds would be even better? Could passive sonar be a possible solution?

> I want to measure distance using a timestamp sent from point A to point B via radio waves.

What's the distance from point A to B?

Also, are you wanting to measure just the distance from A to B, or a reflected signal time A > B > A?

Sound waves are much much much slower than light.

The distance will be no more than a few hundred feet. All I want is the distance from A to B, the only reason I want accurate signal time is to back into the distance.

I don’t really care if the signal is reflected or not, however it needs to be done to get an accurate distance measurement. Initially I thought one way would be easier to deal with, but time resolution is posing a real problem.

Yes they are; the question is whether they can reliably be used to A) transmit data or B) ping for distance. Would working with sound even be any easier?

> The distance will be no more than a few hundred feet. All I want is the distance from A to B, the only reason I want accurate signal time is to back into the distance.
>
> I don’t really care if the signal is reflected or not, however it needs to be done to get an accurate distance measurement. Initially I thought one way would be easier to deal with, but time resolution is posing a real problem.

If you consider trying to measure the distance from A to B only, that's not possible using radio or laser.
You have no time datum point from which to measure!

Well, you could have a time datum to measure from. If clocks A and B are synchronized, clock A sends a message to clock B via radio telling clock B what time it is. The difference between what time clock A says it is and what clock B knows it is equals the time it took for the message to arrive. Which is why the whole deal needs to be in nanoseconds; hell, picoseconds would be even better.

I believe a 1GHz processor can complete one machine cycle per nanosecond; is that to say it could power a clock with a resolution of one nanosecond? I don’t know if that in and of itself is true or not – or if there is programming available for it.
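Assuming the clocks really were perfectly synchronized (the hard part in practice: any synchronization error adds directly to the apparent flight time), the one-way calculation would look like this sketch:

```python
# One-way time-of-flight ranging from send/receive timestamps,
# assuming clocks A and B are perfectly synchronized.
FEET_PER_SECOND = 186_000 * 5_280   # speed of light, approx. 982,080,000 ft/s

def one_way_distance_feet(t_sent, t_received):
    """Distance implied by the timestamp difference (both in seconds)."""
    return (t_received - t_sent) * FEET_PER_SECOND

# Two stations about 300 ft apart see roughly 305 ns of flight time:
print(one_way_distance_feet(0.0, 305e-9))
```

Note what the numbers demand: at a few hundred feet, the entire flight time is a few hundred nanoseconds, so the two clocks would have to agree to well under that for the answer to mean anything.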

> Ok so just to be clear, using a 1 or 2 GHz processor cannot power a clock or timer with nanosecond resolution? It would seem that this would be a common problem – is there really no common answer? Is there no way to do this with software provided it has the processing power to back it up?
>
> If the answer is no it can’t be done without a team of engineers and NASA’s budget the next obvious question is how else can it be done. Are sound waves predictable enough? Besides sound waves is there anything slower than light but as predictable, something where microseconds would work – milliseconds would be even better. Could passive sonar be a possible solution?

A 1 or 2GHz counter (not a microprocessor) can have a resolution of 1ns or less. Of course you need fast pulse circuitry to transmit the pulse and fast analog circuitry to detect and resolve the return pulse with ns resolution also.

Digital processing can only do limited things, such as averaging or filtering a number of the return pulses to reduce noise or jitter, but it can't improve the basic resolution of the signal.

An active (not passive) sonar system could possibly work, but it's difficult to focus the transmitted signal and resolve small objects, since the wavelength of sound is much longer than that of light, unless the frequency is in the high ultrasonic. Sound only travels about 1,100 ft/sec, so the circuitry doesn't have to be very fast to obtain reasonable resolution.
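The speed difference is exactly why ultrasonic ranging works on slow hardware. Comparing the per-tick resolution of the two wave types (same rounded speeds as used in this thread):

```python
# Distance resolved per timer tick, for sound vs. light.
SOUND_FT_PER_S = 1_100               # speed of sound in air, approx.
LIGHT_FT_PER_S = 186_000 * 5_280     # speed of light, approx.

def resolution_feet(wave_speed_ft_s, timer_resolution_s):
    """Smallest distance step resolvable at a given timer resolution."""
    return wave_speed_ft_s * timer_resolution_s

print(resolution_feet(SOUND_FT_PER_S, 1e-6))   # sound, 1 us timer -> ~0.0011 ft
print(resolution_feet(LIGHT_FT_PER_S, 1e-6))   # light, 1 us timer -> ~982 ft
```

In other words, the same microsecond timer that is nearly a thousand feet coarse for radio resolves sound travel to about a hundredth of an inch, which is why cheap microcontroller-based ultrasonic rangefinders exist.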

Your choice is between sound and light. There's no other type of wave available for use.

I'm not aware of any optical sensor / LED that works at 1GHz.

Carl

Is there any prepackaged solution that you know of, a plug and play if you will?

Google "laser rangefinder" or "laser distance" for many hits. Perhaps one of those would work for you.

