
How to delay an analog signal by a variable amount?


pavjayt

Member
Any ideas on how to delay an analog signal for a variable amount of time?

Analog signal: 0-1 V p-p
Bandwidth: 20 MHz

I should be able to delay it anywhere between 5 ns and 0.5 µs. Is this possible? I came across the attached article, which says it can achieve a ~11 ns delay, but I'm not sure about the circuit as it doesn't show anymore. Can this be extended to make the delay variable?

thanks
 

Attachments

  • electronicdesign_10537_anaccurateanalogdelaycircuit.pdf
    63.8 KB
Yeah, that's the kind of first response I see on all posts related to this on any forum :). Coming back to reality, is there any electronic way to do this rather than running spools of cable around the lab?
 
The only thing that comes to mind is an ADC to memory followed by a DAC. Some of the Digital Signal Processors will handle it.
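Something like this, as a minimal Python sketch of the ADC-to-memory-to-DAC idea, modelled as a circular buffer. The 100 MS/s sample rate is an assumption, chosen to sit comfortably above the 40 MS/s Nyquist minimum for a 20 MHz signal:

```python
# Minimal sketch of the ADC -> memory -> DAC delay, modelled as a
# FIFO of samples. The 100 MS/s rate is an assumed figure.
from collections import deque

SAMPLE_RATE = 100e6  # samples per second (assumed)

def make_delay(delay_seconds):
    """Return a per-sample delay function backed by a FIFO."""
    n = max(1, round(delay_seconds * SAMPLE_RATE))
    fifo = deque([0.0] * n, maxlen=n)  # pre-filled so output starts at 0 V

    def step(sample):
        out = fifo[0]        # oldest sample goes out to the "DAC"
        fifo.append(sample)  # newest "ADC" sample comes in
        return out

    return step

delay = make_delay(0.5e-6)  # 0.5 us -> a 50-sample buffer at 100 MS/s
```

Note that the 0.5 µs end of the range is an easy 50-sample buffer, but 5 ns is only half a sample period at 100 MS/s, so it's the short delays that set the real sample-rate requirement.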

Mike.
 
Yeah, that's the kind of first response I see on all posts related to this on any forum :). Coming back to reality, is there any electronic way to do this rather than running spools of cable around the lab?
Using cables as a signal delay is/was reality.

Some 40 years ago I was connected with a local television station when they moved into a new studio. The design spec called for every TV signal (camera, VCR, etc.) in the building to have the exact same timing as any other signal, so that the central matrix switch could change between any signal without the screen glitch that would occur if the two sources weren't in sync.

To do this, they had a central sync signal generator. The cables from the sync generator were all the same length. If the furthest video source in the building was 300 feet away, then ALL cables were 300 feet long. And, ALL cables from each video source in the building would also be 300 feet long. This meant that there were tens of thousands of feet of cable coiled up all over the building.

There were variable delay systems that could have been used to do the same job but, at the time, they were very expensive. I expect that with the current state of technology a digital delay would be much cheaper than those matched cables. And, as TV production is likely all digital nowadays, it's probably not necessary.

But, to answer your question, I agree with what Pommie said in post #5.
 
The cables from the sync generator were all the same length.
This is the reason why the BBC's original Television Centre in London is round - so that the cable runs from all the studios to the control room, in the centre, were the same length.
 
The old PAL analogue TV decoders used two delay lines. One (the chroma delay line) delayed the signal by one line length, 64 µs, so that phase cancellation could cure one of the biggest failures of NTSC; it was a 'crystal' delay line. The other (the luma delay line), a much shorter one, delayed the luma to account for the chroma processing delay so that luma and chroma matched up. This was basically a long coil with many thin windings, and some of the very early sets used a coiled-up length of coaxial cable to provide the short delay.

The OP's requirement looks quite difficult, particularly presuming he needs good accuracy and low distortion from DC to 20 MHz.
 
Analog video recorders had a circuit called a timebase corrector to remove video sync timing errors created by minute changes in the velocity of the rotating heads. The Ampex AMTEC had a voltage-variable L-C delay line, with varactors for the capacitors. Bandwidth was above 5 MHz. The delay control signal was derived by comparing the played-back sync pulse to an external reference sync pulse. Back in the '70s, I built one as part of a project to resurrect an old clunker with all new electronics.

A second corrector (the COLORTEC) fine-tuned the signal with another variable delay line. This line had a smaller adjustment range, a faster response time, and much better precision. The control signal was based on color burst rather than sync. The next generation machine (Ampex AVR-1) used fixed delay lines with binary-weighted delay values, and switched them in and out of the signal path with very fast analog switches.

Each delay line was a long ferrite rod inductor with a single-layer winding with many taps. Each tap had a varactor diode, and all of the diodes were driven together.

Either method (variable or fixed/switched) should work for you, although getting a 100:1 adjustment range out of a variable line will be difficult.
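For the fixed/switched version, picking which binary-weighted lines to switch in is just reading off the bits of the step count. A rough Python sketch, where the 5 ns unit line and the 7-line bank are my assumptions, sized to reach the 0.5 µs end of the range:

```python
# Select binary-weighted fixed delay lines for a target delay, in the
# spirit of the switched-line approach described above. The unit step
# (5 ns) and line count (7) are assumed values.
UNIT = 5e-9  # shortest line: 5 ns
BITS = 7     # lines of 5, 10, 20, 40, 80, 160, 320 ns -> 0 to 635 ns

def switch_settings(target_delay):
    """Return per-line switch states (True = line in circuit)."""
    steps = min(round(target_delay / UNIT), 2**BITS - 1)
    return [bool((steps >> b) & 1) for b in range(BITS)]

settings = switch_settings(0.5e-6)  # 100 steps -> lines 2, 5, 6 in
total = sum(UNIT * 2**b for b, on in enumerate(settings) if on)
print(settings, total)              # total is exactly 500 ns
```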

ak
 
Back in the 1980s there were "bucket brigade" analog delay chips, which consisted of a pair of rows of MOSFETs and capacitors. The input signal was sampled by the first MOSFET pair and the voltage stored on the first pair of capacitors (basically a simple sample-and-hold circuit). A clock pulse would advance the stored voltage to the next stage while the next sample was taken from the input, so the sample rate and delay were controlled by the clock frequency. These were only audio-bandwidth devices. You could probably do something similar for a 20 MHz bandwidth, but the clock frequency would have to be at least 200 MHz. These days such analog delay functions are done with DSP, and RF DSP devices are expen$ive.
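As a back-of-envelope check on that, in the idealized chain described above the delay is just stages divided by clock frequency (real two-phase BBDs come out at stages / (2 × clock), but the scaling is the same). A quick sketch:

```python
# Bucket-brigade delay arithmetic for the idealized one-stage-per-clock
# chain described above (real two-phase BBDs give stages / (2 * f)).
def bbd_delay(stages, f_clock):
    return stages / f_clock

# A 0.5 us delay at the suggested 200 MHz clock needs 100 stages:
print(0.5e-6 * 200e6)            # -> 100.0
# Sweeping the clock tunes the delay, but a 100-stage chain would
# need a 20 GHz clock to reach the 5 ns end of the range:
for f in (200e6, 2e9, 20e9):
    print(f, bbd_delay(100, f))  # 0.5 us, 50 ns, 5 ns
```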

analogkid has a neat idea, and you could likely get some fine-grained control out of it without being continuously variable. Select the smallest time step you can work with, then cut pieces of coax twice, 4 times, and 8 times as long; this would be the first stage. The comparable steps in the next stage would be 10, 20, 40, and 80 times the base step, and a third stage would be 100, 200, 400, and 800. You could probably get away with using mini toggle switches arranged so that the input side of each piece of coax is shorted to its output side when the toggle is down, and the signal must pass through the coax when the toggle is up. Arrange all the coax pieces in line with each other, and you now have a delay line with a set of BCD (binary coded decimal) switch banks to change the delay time; the rough cable lengths work out as below.
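A quick Python sketch to put numbers on it, assuming solid-polyethylene coax (velocity factor about 0.66, roughly 5 ns per metre) and a 1 ns base step so the three decades cover the OP's whole 5 ns to 0.5 µs range:

```python
# Rough coax lengths for the switched ladder above. The velocity
# factor (0.66, RG-58-style cable) and 1 ns base step are assumptions.
C = 3e8                    # speed of light, m/s
VF = 0.66                  # velocity factor of the cable (assumed)
NS_PER_M = 1e9 / (C * VF)  # ~5.05 ns of delay per metre

def coax_length_m(delay_ns):
    return delay_ns / NS_PER_M

for ns in (1, 2, 4, 8, 10, 20, 40, 80, 100, 200, 400, 800):
    print(f"{ns:4d} ns -> {coax_length_m(ns):7.2f} m")
# 1 ns is about 20 cm of cable; 800 ns is already ~160 m, which is
# why the long end of the range favours the ADC/DAC approach.
```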
 
The BCD is a nice idea unclejed, just like a switched attenuator, but with the attenuator pads replaced by coax delay lines.

JimB
 
The AVR-1 values were binary stepped, not BCD - more efficient. With BCD you get 9 delay values in 4 bits; with binary you get 15.

To the TS: What is the signal? What is its waveform / waveshape? Is it asymmetrical about GND? Does it have a DC offset? Does the delay value have to change while a live signal is going through it, or is it more like change the value, run the signal, change the value, run the signal ...? Is it critical that the delay value change continuously / smoothly, or is a fine-grained stepped delay acceptable?

ak
 
The BCD is a nice idea unclejed, just like a switched attenuator, but with the attenuator pads replaced by coax delay lines.

JimB
With the hand switches you could (if you needed to) switch all of the switches on in a decade, or use it in a binary/decimal format. It's kind of "non-standard" to do it that way, but it does add a small bit of flexibility. It takes a bit of addition to figure out the actual numeric value of the delay: the max value would be 1500 + 150 + 15 = 1665 times the base step, as the snippet below totals up.
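For what it's worth, that addition is easy to mechanize. A small Python sketch, assuming the three 4-switch decade banks described above:

```python
# Total delay (in units of the smallest step) from the three decade
# switch banks above. All switches on gives the 1665 maximum.
BANKS = [(1, 2, 4, 8), (10, 20, 40, 80), (100, 200, 400, 800)]

def total_delay(switches):
    """switches: three 4-tuples of booleans, one per decade bank."""
    return sum(v for bank, states in zip(BANKS, switches)
                 for v, on in zip(bank, states) if on)

print(total_delay([(True,) * 4] * 3))  # -> 1665
```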
 