Nanosecond Timer

Use the digital equivalent of the old rope and pulley system.

Say I have a string 1000 feet long with a knot tied in it every 100 feet, and I need to measure something that's about 10 feet long with it reasonably accurately. My measuring resolution is only 100 feet, so knot-to-knot obviously won't work. However, if I use two gangs of pulleys that wrap the string around them 100 times back and forth, they will move exactly one foot closer for every 100 feet of string pulled off of them, or one foot farther for every 100 feet of string added to them. Now my 100-foot marks indirectly measure 1-foot increments accurately.

See the method yet? Send out a timing reference mark (the knot). Send it to the receiver, have the receiver send it back to the transmitter, then back to the receiver again, and repeat this 100 times. Once the small variation in distance has been multiplied over an exact count of 100 loops, re-reference the accumulated delay to the master clock to see how far off it is after all those loops, and you have a calculable measurement of a small distance made with a long measuring device.

Granted, you will probably need to calibrate it and average out the slight variations in how the signal is received and then retransmitted at each end, but if done enough times, perhaps thousands or more, and averaged properly, you can in fact get very small measurements without using very small measuring devices.
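
A minimal numeric sketch of that idea in C (the 10 ns counter resolution, the 100 round trips, and the 10.17 ns one-way delay are all made-up numbers, not a real design); the loop count plays the same role as the 100 wraps of string:

```c
#include <stdio.h>

int main(void)
{
    const double tick_ns     = 10.0;   /* counter resolution (assumed)        */
    const int    round_trips = 100;    /* loops, like the 100 wraps of string */
    const double one_way_ns  = 10.17;  /* "true" one-way delay (made up)      */

    /* Total time accumulated over all the loops (two legs per round trip). */
    double total_ns = one_way_ns * 2.0 * round_trips;

    /* The coarse counter only sees the nearest whole tick of that total. */
    long   ticks       = (long)(total_ns / tick_ns + 0.5);
    double measured_ns = ticks * tick_ns;

    /* Dividing back down recovers the one-way delay; the half-tick
     * quantization error has been divided by the number of legs too. */
    double estimate_ns = measured_ns / (2.0 * round_trips);

    printf("counted %ld ticks (%.0f ns total)\n", ticks, measured_ns);
    printf("one-way estimate %.3f ns (true %.3f ns)\n", estimate_ns, one_way_ns);
    return 0;
}
```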

Just a theory. :)
 
This is how CDMA-based systems measure distance. GPS uses chip-rate code-word correlation to measure the relative delay between multiple signals.

The modulation envelope must have a raw bit period (time per bit) short enough to give the desired resolution. For nanosecond resolution you need a very high bit rate and a correspondingly wide RF bandwidth. An ultra-wideband WLAN system would come close to this.

And this requires dedicated hardware.
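
A rough C sketch of that correlation idea (the code length, the 37-chip delay, and the ±1 chip sequence are all invented; a real GPS or CDMA receiver is far more involved):

```c
#include <stdio.h>
#include <stdlib.h>

#define N 1024                       /* code length in chips (arbitrary) */

int main(void)
{
    static int code[N], rx[N];
    const int true_delay = 37;       /* chips of delay to recover */

    srand(1);
    for (int i = 0; i < N; i++)      /* pseudorandom +/-1 chip sequence */
        code[i] = (rand() & 1) ? 1 : -1;

    for (int i = 0; i < N; i++)      /* "received" signal = delayed copy of the code */
        rx[i] = code[(i - true_delay + N) % N];

    int  best_lag = 0;
    long best_sum = -1;
    for (int lag = 0; lag < N; lag++) {          /* circular cross-correlation */
        long sum = 0;
        for (int i = 0; i < N; i++)
            sum += (long)rx[i] * code[(i - lag + N) % N];
        if (sum > best_sum) { best_sum = sum; best_lag = lag; }
    }

    printf("correlation peak at lag %d chips (true delay %d)\n",
           best_lag, true_delay);
    return 0;
}
```

The raw peak only locates the delay to within one chip period, which is why the chip rate, and hence the RF bandwidth, sets the resolution as noted above.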
 
...
See the method yet? Send out a timing reference mark (the knot). Send it to the receiver, then back to the transmitter, then back to the receiver again, and repeat this 100 times...
...

That's very clever! :)

But here's another system that doesn't need bidirectional comms.

Just incorporate a random (or ramp) variation in the timing of either the transmitter or the receiver. So if the receiver has a timing resolution of 1 µs, add a 0–1 µs variable "handicap" before it starts timing. Then receive the signal 1000 times or so and add all the times. Now you have timing resolution down toward 1 ns.

To add the random or ramping factor, you could use the beat-frequency difference between the two xtal clocks of the transmitter and the receiver.
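
A quick simulation of that trick, just to show that the averaging works (the 1 µs resolution, the 3.217 µs delay, and the 1000 samples are arbitrary; a ramp "handicap" derived from the beat frequency would behave similarly):

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(void)
{
    const double res_us  = 1.0;      /* coarse timer resolution          */
    const double true_us = 3.217;    /* delay we pretend to measure      */
    const int    samples = 1000;

    srand(2);
    double sum = 0.0;
    for (int i = 0; i < samples; i++) {
        /* Random 0-1 us "handicap" smears the quantization error so it
         * averages out instead of always rounding the same way. */
        double dither  = (double)rand() / RAND_MAX * res_us;
        double counted = floor((true_us + dither) / res_us) * res_us;
        sum += counted;
    }
    printf("averaged estimate: %.3f us (true %.3f us)\n",
           sum / samples, true_us);
    return 0;
}
```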
 
To give students a bit of a grasp of the speed of light, I approximate it by telling them that light travels about one foot in one nanosecond. Fast digital circuitry can handle a lot of that; it's often the sensors that are the slow part and keep you from making "simple" measurements.

Dean
 
You could put two transmitters a set distance apart and measure the angle between them from the receiver.

You could put a GPS unit in each side and calculate the distance using the positions.

How about a 100 foot measuring tape?
 
Picking up on Bill Naylor's post, I have been playing around with an idea that uses the capacitor discharge rate for timing. Step 1: Charge two identical RC circuits to a predetermined potential (in parallel?). Step 2: With fast switches and/or logic gates, arrange for the first RC circuit to begin its discharge when a start signal is received. Step 3: Have the second RC circuit begin discharging in series with the first RC circuit. Step 4: Detect the peak voltage to determine the elapsed time. If the voltage monitoring is restricted to the first time constant (the first "e"), the descending slope of the first RC circuit should remain steep enough to pick off the peak voltage with a peak detector or an ADC of a resolution appropriate for the application. Does this make any sense and, if so, where can I find switches and logic gates that can handle this resolution?
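
For reference, the textbook RC discharge relation this scheme leans on (a general formula, not specific to the circuit described):

```latex
% Standard RC discharge from an initial voltage V_0:
V(t) = V_0 \, e^{-t/RC}
% so the elapsed time recovered from a sampled voltage V is
t = RC \, \ln\!\left(\frac{V_0}{V}\right)
```

Near the start of the discharge the slope is about -V0/RC, so an n-bit ADC with full scale V0 resolves time to very roughly RC/2^n, which gives a feel for how small RC has to be for nanosecond work.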

Thanks,
Bob Agnes
 
Hi minglemind

I don't fully understand what you are trying to do. Can you submit a schematic, please?

I would not use RC circuits. The tolerances of an RC circuit (especially the capacitor) mean that your results could be wildly out, and using two RC circuits makes things doubly bad.

What timeframe are you trying to measure?

Please let me know

Thanks
 
Polaroid used to make a very simple device with an ultrasonic transducer and a digital counting timer. The device would send out a ping and count up until the return pulse was detected. Once the return pulse was detected, the count was stopped and passed to a motor driver circuit, which would step the motor that number of steps. The motor was the focus mechanism for a camera, and the count was accurate to within a few inches. I was one of the techs who built the automated test fixture for the camera electronics under contract.
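
A bare-bones model of that count-until-echo scheme in C (the 10 kHz count clock and the tick at which the "echo" arrives are invented, and it converts the count to a distance instead of passing it to a focus-motor driver as the real unit did):

```c
#include <stdio.h>
#include <stdbool.h>

/* Stand-in for the echo comparator: pretend the echo arrives at tick 175. */
static bool echo_detected(unsigned long ticks)
{
    return ticks >= 175;
}

int main(void)
{
    const double f_clk_hz = 10000.0;    /* 10 kHz count clock (illustrative) */
    const double v_sound  = 343.0;      /* speed of sound in air, m/s        */

    unsigned long ticks = 0;
    /* send_ping() would fire the transducer here; then count until the echo */
    while (!echo_detected(ticks))
        ticks++;

    double round_trip_s = ticks / f_clk_hz;
    double distance_m   = v_sound * round_trip_s / 2.0;   /* out and back */

    printf("%lu ticks -> %.2f m (one tick = %.1f mm of range)\n",
           ticks, distance_m, v_sound / f_clk_hz / 2.0 * 1000.0);
    return 0;
}
```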


The problem with measuring in nanoseconds is that you need an RTOS (real-time operating system) to do the software part of it, and you have to take into account the propagation delays of the signals through the various parts of the electronics.


You could measure in 10-nanosecond increments with a 100 MHz clock, 5-nanosecond increments with a 200 MHz clock, or 2-nanosecond increments with a 500 MHz clock, but the higher the clock frequency, the more the propagation delays within the electronics skew the accuracy of the measurements.
 
Hi to all,

I am Sumanta, and I am facing a problem with the implementation of some logic in a microcontroller. The problem is as follows:

Suppose I have two ports named PORT1 and PORT2. The port states will change in the following way:


PORT1            PORT2            COMMENT
5 V (Logic 1)    5 V (Logic 1)    Initial condition
0 V (Logic 0)    5 V (Logic 1)    Timer start
0 V (Logic 0)    0 V (Logic 0)    Timer stop


Now, the time gap between timer start and timer stop will be in the nanosecond range. I want to display this time delay on a display device. This will be done one time, not in a continuous manner. If another experiment is needed, it will be performed manually by pressing the power on/off switch or the reset button.

Please give me your suggestions regarding the problem.
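
For what it's worth, here is a heavily hedged sketch of the usual start/stop capture structure for a question like this, written as a host-side simulation; the port stubs and the 100 MHz timer are placeholders, not any particular microcontroller's registers, and on real hardware you would let a timer's input-capture unit latch the timestamps so polling jitter doesn't swallow the nanoseconds:

```c
#include <stdio.h>
#include <stdint.h>

#define TIMER_CLOCK_HZ 100000000UL          /* assumed 100 MHz timer clock */

static uint32_t timer_count = 0;            /* stands in for a free-running counter */
static uint32_t read_timer(void) { return timer_count; }

/* Stubbed "ports": PORT1 drops to 0 V at tick 5, PORT2 at tick 30. */
static int port1_low(void) { return timer_count >= 5; }
static int port2_low(void) { return timer_count >= 30; }

int main(void)
{
    while (!port1_low())                    /* wait for the start edge */
        timer_count++;
    uint32_t start = read_timer();

    while (!port2_low())                    /* wait for the stop edge  */
        timer_count++;
    uint32_t stop = read_timer();

    uint32_t ticks = stop - start;                            /* 25 ticks here */
    uint32_t ns    = ticks * (1000000000UL / TIMER_CLOCK_HZ); /* 10 ns per tick */

    printf("gap: %lu ticks = %lu ns\n",
           (unsigned long)ticks, (unsigned long)ns);
    return 0;
}
```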

Thanks in advance.


Email Id - (show.sumanta@rediffmail.com / sumanta.show@gmail.com)

Mobile No.- (9932582375)
 
@ shom-show
You're more likely to get a reply if you start your own thread instead of hijacking someone else's (which is bad etiquette).
 
First off... I am an electronic design engineer, and as such a bit jaded when it comes to hobbyist-caliber parts.

That said, it is actually quite easy to get to 10 ns if you get a 100 MHz reference and a counter... the easy way appears to be an LPC1751. Offhand, I do not see anything saying that the counter cannot run at the 100 MHz top CCLK rate.

If it can't, just put an external 4-bit counter out front. As a bonus, the PLL instability might be just enough to give you better resolution by averaging.

BTW, you do get better resolution by averaging out noise: it improves by the square root of the number of samples averaged.
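
A throwaway C simulation of that square-root rule (the ±5 ns uniform jitter and the 123.4 ns value are invented; the point is only that the RMS error of the average falls roughly as 1/sqrt(N)):

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* One reading of a fixed delay with +/-5 ns of uniform jitter added. */
static double noisy_reading(double true_ns)
{
    double jitter = ((double)rand() / RAND_MAX - 0.5) * 10.0;
    return true_ns + jitter;
}

int main(void)
{
    const double true_ns = 123.4;
    const int    trials  = 500;        /* repeats per N to estimate the RMS error */
    srand(3);

    for (int n = 1; n <= 1000; n *= 10) {
        double sq_err = 0.0;
        for (int t = 0; t < trials; t++) {
            double sum = 0.0;
            for (int i = 0; i < n; i++)
                sum += noisy_reading(true_ns);
            double err = sum / n - true_ns;
            sq_err += err * err;
        }
        printf("N = %4d  RMS error = %.3f ns\n", n, sqrt(sq_err / trials));
    }
    return 0;
}
```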

Dan
 