Measuring the Speed of Light Using Infrared Pulses


Moneer81

New Member
Hello,

I am doing a project that will enable me to measure the speed of light. The way I want this to work is by sending infrared pulses from a transmitter and receiving them some distance d away with an infrared detector. I want to be able to use an oscilloscope and measure the shift between the produced pulses and the received pulses. The shift will correspond to a certain time t and dividing d/t should give us the speed of light.

Couple questions....

1. Has anyone tried this before?

2. What kind of frequencies will I need to use? Will a frequency generator do the trick or would I need to build my own circuit?

3. Due to the large value of the speed of light, would I be able to detect any shift?

Any help would be appreciated.
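For a sense of scale, the expected shift over bench distances works out as below (a minimal Python sketch; the separations are just example values, and any LED driver or detector delay is ignored):

```python
# Rough scale check for the proposed pulse time-of-flight measurement.
# Assumption: propagation in air at essentially the vacuum speed of light,
# with an ideal emitter and detector adding no delay of their own.

C = 299_792_458.0  # speed of light in vacuum, m/s

for d in (1.0, 10.0, 30.0):  # example transmitter-receiver separations, metres
    t = d / C                # expected time shift, seconds
    print(f"d = {d:5.1f} m  ->  shift = {t * 1e9:6.2f} ns")

# d =   1.0 m  ->  shift =   3.34 ns
# d =  10.0 m  ->  shift =  33.36 ns
# d =  30.0 m  ->  shift = 100.07 ns
```

So over lab-scale distances the shift ranges from a few nanoseconds up to roughly a hundred nanoseconds, which is what sets the oscilloscope bandwidth and the emitter/detector rise times you need.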
 
Moneer81 said:
1. Has anyone tried this before?
Yeah. It's usually people who want to make an electromagnetic-wave rangefinder (yours is essentially the same thing, except you use a known distance and the measured time to calculate speed rather than speed and time to calculate distance) but who have given up on the time-of-flight method, since it's hard to time something that moves at speeds comparable to the switching speeds of most electronics, yet still want to use electromagnetic waves. So they try to find a way around it using phase shift, but that turns out to be essentially just as hard.

I haven't heard of anyone who has successfully home-built any EM rangefinder yet, whether phase-shift or time-of-flight. And no one does parallax measurements, because it's so boring and you might as well just buy a Sharp IR sensor to do it for you.

**broken link removed**

How do you even detect the phase shift in light anyway? If we could see 1 Hz electromagnetic waves, would we see something like a light bulb fading in and out at 1 Hz? I don't think you can just hook a detector up to an oscilloscope and see the oscillation of the propagating waves. Sure, you can see the amplitude of the waves changing if the source is changing (like a light dimmer), but that's not the same as seeing the actual instantaneous magnitude of the wave oscillating. Then again, you can pick up radio waves and see those on an oscilloscope, so I'm pretty sure I'm just wrong. Maybe they're just at such high frequencies that I don't encounter them in daily life in a way that makes them easy to comprehend.
 
Moneer81 said:
I am doing a project that will enable me to measure the speed of light. The way I want this to work is by sending infrared pulses from a transmitter and receiving them some distance d away with an infrared detector. I want to be able to use an oscilloscope and measure the shift between the produced pulses and the received pulses. The shift will correspond to a certain time t and dividing d/t should give us the speed of light.

Couple questions....

1. Has anyone tried this before?

I've cobbled together a time domain reflectometer. That measures the length of a cable by timing a pulse as it travels along the wire, given the propagation speed. The project sounds very do-able.
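The arithmetic behind that reflectometer is simple (a minimal Python sketch; the 0.66 velocity factor and the 310 ns reading are assumed example values, not measured ones):

```python
# Time domain reflectometry: estimate cable length from a measured
# round-trip reflection delay. The velocity factor below is an assumed
# figure for solid-polyethylene coax; use the value from the cable datasheet.

C = 299_792_458.0       # speed of light in vacuum, m/s
velocity_factor = 0.66  # assumed, typical for solid-PE dielectric coax
t_round_trip = 310e-9   # example measured reflection delay, seconds

# Divide by 2 because the pulse travels to the far end and back.
length = velocity_factor * C * t_round_trip / 2
print(f"estimated cable length: {length:.1f} m")  # ~30.7 m
```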

Moneer81 said:
2. What kind of frequencies will I need to use? Will a frequency generator do the trick or would I need to build my own circuit?

I used a frequency generator adjusted to put out 100 kHz square waves.

Moneer81 said:
3. Due to the large value of the speed of light, would I be able to detect any shift?

Connect the frequency generator to scope channel 1. Use a BNC T connector on the scope end. Connect a 100 foot coaxial cable between oscilloscope Ch 1 and the IR or visible laser transmitter. Terminate both ends of the cable with the appropriate termination resistance. While the receiver is very close to the transmitter, observe the output of the frequency generator on channel 1 of the scope. Set the trigger to channel 1. Use chop mode, not alt. Delay the trigger so that you can see the leading edge. Channel 2 goes to the output of the receiver. Expand the waveform way, way out with the time/div knob until you can see the slope of the leading edge of the transmitted square wave. Note the timing difference (delay) of the received square wave.

Without adding any more cable to anything, increase the distance between the transmitter and receiver by 100 ft. Again note the difference in timing. Subtract the initial time measurement from the distance time measurement. That gives you the time it takes for light to travel 100 feet in air. I think I remember that it should be ~98ns.
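For reference, the time of flight over 100 ft of air works out as below (a quick Python sketch, assuming propagation at essentially the vacuum speed of light):

```python
# Extra delay expected when the transmitter-receiver gap grows by 100 ft of air.
C = 299_792_458.0   # speed of light in vacuum, m/s
d = 100 * 0.3048    # 100 ft expressed in metres
t = d / C
print(f"time of flight over 100 ft: {t * 1e9:.1f} ns")  # ~101.7 ns
```

So the shift to look for is on the order of 100 ns, in the same ballpark as the ~98 ns figure remembered above.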

Wear laser safety glasses.

Bob
 
Take an RF amplifier.

Connect an aerial to the input and an aerial to the output.

Place the aerials a known distance apart.

It will oscillate at a frequency determined by the distance between the two aerials.

If they're 1 m apart it should oscillate at 300 MHz, but it might oscillate at 600 MHz or another harmonic.
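As a sanity check on those numbers, here is a minimal Python sketch assuming an idealised loop whose only delay is the free-space path between the aerials (no amplifier or cable delay; the posts below explain why that assumption matters):

```python
# Oscillation frequencies of an ideal antenna-to-antenna feedback loop.
# Assumption: the only delay around the loop is the free-space path between
# the aerials. The loop sustains frequencies whose period fits a whole number
# of times into that delay, i.e. f_n = n * C / d.

C = 299_792_458.0  # speed of light in vacuum, m/s
d = 1.0            # aerial spacing, metres

for n in (1, 2, 3):
    f = n * C / d
    print(f"harmonic {n}: {f / 1e6:.0f} MHz")
# harmonic 1: 300 MHz, harmonic 2: 600 MHz, harmonic 3: 899 MHz
```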
 
Hero999 said:
Take an RF amplifier.

Connect an aerial to the input and an aerial to the output.

Place the aerials a known distance apart.

It will oscillate at a frequency determined by the distance between the two aerials.

If they're 1 m apart it should oscillate at 300 MHz, but it might oscillate at 600 MHz or another harmonic.
You need to take into account the propagation delay of the amplifier, and the propagation delay of the cables connected to the antennas.
 
Yeah, the propagation speed in the cable is far slower than the speed of light in free space. And I believe it changes with temperature too. So you'll mostly be reading the cable delay, and you can't simply subtract it out either, because it varies for several reasons. That problem doesn't go away with length; 100 miles would have the same issue.

Any sensor, radio tuner, or amplifier has a delay which could easily be more significant than the time of flight over the distance you're testing, unless the distance is very large.

Hahaha, the first lesson you're learning here is how difficult it is to measure the speed of light. Galileo's first attempt was to have one guy uncover a lantern, have a second person some distance away uncover his own lantern as soon as he saw the first one's light, and have the first guy note the time from uncovering his lantern to seeing the light from the second. All he was able to determine was "it's very fast." That may not seem helpful, but it actually did knock the understanding up a notch.
 
Surely you can find the cable propagation delay on the datasheet and take it into account?
 
Hero999 said:
Surely you can find the cable propagation delay on the datasheet and take it into account?
I don't think you have to. If you set up the antennas at distance D1 and measure oscillation frequency F1, then, while using the same cables, move the antenna spacing to D2 and measure F2, the speed of light C should be

C = (D2 - D1)/(1/F2 - 1/F1),

independent of cable and amplifier prop delays.
I don't think it will be very accurate, but I believe the principle is sound.
Coax delay may vary slightly depending on the radius of curvature - I don't know this for a fact.
I also haven't thought this out thoroughly, so I may be all wet.:eek:
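Here is a small numeric sketch of that cancellation (Python; the 5 ns fixed delay is an arbitrary assumed amplifier-plus-cable delay, included only to show that it drops out of the difference):

```python
# Two-spacing method: any fixed delay in the loop cancels when the two
# measurements are differenced, so only the free-space path contributes.

C = 299_792_458.0  # "true" speed of light used to simulate the measurements, m/s
t_fixed = 5e-9     # assumed amplifier + cable delay, seconds (arbitrary)

def loop_frequency(d):
    """Fundamental oscillation frequency for aerial spacing d (metres)."""
    return 1.0 / (d / C + t_fixed)

D1, D2 = 1.0, 3.0                               # the two aerial spacings, metres
F1, F2 = loop_frequency(D1), loop_frequency(D2)

c_estimate = (D2 - D1) / (1 / F2 - 1 / F1)      # the fixed delay drops out here
print(f"recovered c = {c_estimate:.4e} m/s")    # ~2.9979e+08
```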
 
Roff said:
Coax delay may vary slightly depending on the radius of curvature - I don't know this for a fact.
I also haven't thought this out thoroughly, so I may be all wet.:eek:

Most of the industrial video coax I have worked with has/had a velocity factor ~0.70 speed of light.

It's fun to play with. It has a 3.579545MHz NTSC chroma subcarrier delay of about 2 degrees per foot. So if you attach a 45 foot length of cable to the loop input of a video monitor and leave the end of that length unterminated, the chroma disappears on the monitor. Short the end of the cable and luminance disappears, all you see is chroma.
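As a rough check of the "about 2 degrees per foot" figure (a small Python sketch; it assumes the 0.70 velocity factor quoted above and lossless cable):

```python
# Phase delay of the NTSC chroma subcarrier along coax, and the electrical
# length of a 45 ft stub at that frequency.

C = 299_792_458.0  # speed of light in vacuum, m/s
vf = 0.70          # cable velocity factor (figure from the post above)
f = 3.579545e6     # NTSC chroma subcarrier frequency, Hz
foot = 0.3048      # metres per foot

deg_per_ft = 360.0 * f * foot / (vf * C)       # phase accumulated per foot of cable
print(f"phase delay: {deg_per_ft:.2f} deg/ft") # ~1.9 deg/ft, i.e. "about 2"

stub = 45 * deg_per_ft                         # one-way electrical length of the stub
print(f"45 ft stub: {stub:.0f} deg one way")   # ~84 deg, roughly a quarter wave
```

The 45 ft length is roughly a quarter wavelength at the chroma frequency, so the open stub looks like a short at the loop-through input (killing the chroma), while shorting its far end makes it look open at that frequency but near-short at the low frequencies carrying the luminance.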
 
Bob Scott said:
Most of the industrial video coax I have worked with has/had a velocity factor ~0.70 speed of light.
Right. The velocity factor is determined by the speed of light through the insulating medium surrounding the conductor. For a bare wire in free air, this is about 98-99% c. For typical plastic/foam insulation, it varies between ~60-75% c.
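For reference, the velocity factor follows from the relative permittivity of the insulation via VF ≈ 1/sqrt(eps_r) (a small Python sketch; the permittivity values are typical textbook figures, not taken from any particular cable datasheet):

```python
# Velocity factor of a transmission line from the dielectric's relative
# permittivity, using the standard relation VF = 1 / sqrt(eps_r).

import math

dielectrics = {
    "air (bare wire spaced in air)": 1.0006,
    "solid polyethylene": 2.25,
    "PTFE": 2.1,
}

for name, eps_r in dielectrics.items():
    vf = 1.0 / math.sqrt(eps_r)
    print(f"{name:32s} VF = {vf:.2f}")
# air ~1.00, solid PE ~0.67, PTFE ~0.69
```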
 
Bob Scott said:
Most of the industrial video coax I have worked with has/had a velocity factor ~0.70 speed of light.

It's fun to play with. It has a 3.579545MHz NTSC chroma subcarrier delay of about 2 degrees per foot. So if you attach a 45 foot length of cable to the loop input of a video monitor and leave the end of that length unterminated, the chroma disappears on the monitor. Short the end of the cable and luminance disappears, all you see is chroma.
Yeah, but I don't see what this has to do with my quote.:confused:
 