Morganrich1
New Member
If I were using a 1 GHz processor, would it be possible to program a timer accurate to one nanosecond? Likewise, could a 2 GHz processor be accurate to half a nanosecond?
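Here is where I'm getting those numbers from, just the cycle-time arithmetic, nothing specific to any real board:

#include <cstdio>

int main() {
    // One clock cycle lasts 1 / frequency seconds.
    const double one_ghz = 1.0e9;   // cycles per second
    const double two_ghz = 2.0e9;
    std::printf("1 GHz cycle time: %.2f ns\n", 1.0e9 / one_ghz);  // 1.00 ns
    std::printf("2 GHz cycle time: %.2f ns\n", 1.0e9 / two_ghz);  // 0.50 ns
    return 0;
}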
I want to measure distance using a timestamp sent from point A to point B via radio waves. I wanted to do this with an Arduino; however, it is only accurate to +/- 4 microseconds.
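For reference, this is roughly how I was timestamping on the Arduino. As I understand it, micros() on a standard 16 MHz board only increments in 4-microsecond steps, which is where the +/- 4 microseconds comes from (please correct me if that's wrong):

// Minimal sketch for a standard 16 MHz Arduino showing the micros() granularity.
unsigned long t_start;

void setup() {
  Serial.begin(115200);
  t_start = micros();                        // timestamp at point A
}

void loop() {
  unsigned long elapsed = micros() - t_start; // elapsed time in microseconds
  Serial.println(elapsed);                    // steps in multiples of 4 us on a 16 MHz board
  delay(1000);
}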
The speed of light is 186,000 miles per second and I am measuring in microseconds: (0.000001 * 186,000) = 0.186 miles, or 982.08 feet per microsecond. If my margin of error is +/- 4 microseconds, wouldn't my accuracy only be good to (982.08 * 4), or +/- 3,928.32 feet?
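Worked out in code so the units are explicit (assuming the ideal case where the timer's resolution is the only source of error):

#include <cstdio>

int main() {
    const double c_miles_per_s = 186000.0;   // speed of light, miles per second
    const double feet_per_mile = 5280.0;
    const double one_us        = 1.0e-6;     // one microsecond, in seconds

    double feet_per_us = c_miles_per_s * one_us * feet_per_mile;  // 982.08 ft per microsecond
    double error_feet  = feet_per_us * 4.0;                       // +/- 4 us timing error

    std::printf("light travels %.2f ft per microsecond\n", feet_per_us);
    std::printf("+/- 4 us  ->  +/- %.2f ft of distance error\n", error_feet);
    return 0;
}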