How to delay a 1Hz square wave with microsecond precision

csnsc14320

New Member
I have a 1 Hz square wave to which I would like to add a variable delay of between 1 microsecond and 1 second (1,000,000 microseconds).

The idea is this:
- The input 1 Hz square wave rises
- After X microseconds (where X is chosen beforehand, between 1 and 1,000,000 microseconds), output HIGH.

Basically I want to phase-shift the 1 Hz signal by a specified time. From there, I would also like to control when the output stops being HIGH (i.e. while the input square wave is HIGH for 0.5 seconds and then LOW for 0.5 seconds, I may only want the output HIGH for, say, 0.2 seconds, then back to LOW until the next LOW-to-HIGH transition of the 1 Hz wave).

The output will drive an LED, which will represent the delayed signal. A photodetector will then record when the LED turns off, to verify that the delay is correct.

Anyone have any ideas with this?

I would assume that doing it in hardware with flip-flops and counters would make the "variable" delay aspect a pain, so I am leaning towards a microcontroller. I have been experimenting with an Arduino with a 16 MHz clock, but I no longer think it will be able to do the job.
 
Why not? The Arduino would be plenty for the kind of precision timing you're asking for, though you would probably need to write the code in assembly and account for every instruction cycle.

The instruction cycle of a 16 MHz AVR is 62.5 NANOseconds. Making a microsecond-accurate delay would require a decent bit of careful coding, but I'm sure it can be done.

Just to give you an idea of what you have to do: to get a 1 µs delay with a 62.5 ns clock, you have 16 instruction cycles to work with before you change the I/O pin, and in ASM you can do quite a bit in that time. You'd probably need close to a dozen separate, conditionally selected routines branched near that 16-instruction-cycle boundary. Anything past a few dozen instruction cycles, though, requires nothing more than a timer interrupt and an accurate clock. You do have to account for the number of clock cycles it takes for the interrupt to be serviced, but again, that's just something that needs to be figured out; it can be done.

Considering how the interrupts and the delay code would have to work, the only problem would be with very short delays. If you can limit your minimum phase shift to 25 µs or so, you can get resolution all the way down to the time constant of the clock (62.5 ns) with a single routine. The devil is when the requested delay gets close to the instruction clock cycle, because the branch and compare instructions will create variable jitter on the starting edge from pulse to pulse.
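To make the cycle-counting concrete, here is a minimal busy-wait sketch, assuming avr-gcc and a 16 MHz AVR; the function name delay_us_counted and its constants are illustrative, not from this thread. The inner loop costs 4 cycles (sbiw = 2, brne taken = 2), i.e. 250 ns per pass, so four passes make one microsecond; call overhead and the final 3-cycle pass are left uncompensated here:

```c
#include <avr/io.h>
#include <stdint.h>

/* Busy-wait roughly `us` microseconds on a 16 MHz AVR (us must be >= 1).
   Each loop pass takes 4 cycles = 250 ns, so 4 passes per microsecond.
   16-bit counter, so only good for short delays (below ~16 ms). */
static inline void delay_us_counted(uint16_t us)
{
    uint16_t loops = us << 2;      /* 4 loop passes per microsecond */
    __asm__ volatile (
        "1: sbiw %0, 1 \n\t"       /* 2 cycles: decrement register pair */
        "   brne 1b    \n\t"       /* 2 cycles while the branch is taken */
        : "+w" (loops)
    );
}
```

For anything longer than a few dozen cycles, as said above, a timer interrupt is the cleaner tool.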
 
I am having a somewhat difficult time understanding how to do this, then. Just telling the Arduino to turn pin 13 (the LED) on and off, or any pin for that matter, only yields a square wave of about 200 kHz, so there seems to be some overhead that would prevent me from getting 1 MHz?
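For what it's worth, that ceiling is most likely the overhead of digitalWrite() (pin-number lookup on every call), not the AVR itself; writing the port registers directly changes the pin in a single cycle. A quick Arduino-style sketch, assuming an Uno-class board where pin 13 is PB5:

```c
void setup()
{
    DDRB |= _BV(DDB5);     /* pin 13 (PB5) as output, bypassing pinMode */
}

void loop()
{
    PINB = _BV(PINB5);     /* writing 1 to a PINx bit toggles the pin in hardware */
}
```

Even this version is throttled slightly by the loop() call overhead; a tight for (;;) loop around the PINB write pushes the toggle rate into the MHz range.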

I have a 1 MHz clock source that I am trying to use with the timers on the board, but I am finding the datasheet somewhat confusing and I can't quite get it to work.

The process I ideally envision:
- receive the rising edge of the 1 Hz signal
- start a counter clocked at 1 MHz to count for a predetermined time (e.g. counting from 1 to 6 for a delay of 6 microseconds)
- output HIGH for a specified number of clock cycles
- output LOW, then wait for the next LOW-to-HIGH transition of the 1 Hz wave

So we control both the delay of the signal and how long the output remains HIGH before it is turned off (while we may not be outputting HIGH for a full second, the frequency of the output's LOW-to-HIGH transitions will still be 1 Hz).
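A rough, untested sketch of that flow, assuming an ATmega328P with the 1 Hz input on D2 and the output on PB0 (D8), and using Timer1 off the 16 MHz system clock with prescaler 8 (0.5 µs per tick) rather than the external 1 MHz source; DELAY_US and HIGH_US are made-up names for this example, and polling the input pin adds a few cycles of edge jitter, as discussed above:

```c
#include <avr/io.h>
#include <stdint.h>

#define DELAY_US 6UL        /* phase shift in microseconds, 1..1000000 */
#define HIGH_US  200000UL   /* how long the output stays HIGH */

/* Busy-wait on Timer1 (prescaler 8 -> 0.5 us per tick at 16 MHz).
   Taken in chunks because the counter is only 16 bits wide; resetting
   TCNT1 between chunks costs a few cycles of cumulative error. */
static void wait_ticks(uint32_t ticks)
{
    while (ticks) {
        uint16_t chunk = (ticks > 60000UL) ? 60000U : (uint16_t)ticks;
        TCNT1 = 0;
        while (TCNT1 < chunk)
            ;
        ticks -= chunk;
    }
}

int main(void)
{
    DDRB  |= _BV(DDB0);                /* output on PB0 (Arduino D8) */
    TCCR1A = 0;
    TCCR1B = _BV(CS11);                /* Timer1: system clock / 8 */

    for (;;) {
        while (PIND & _BV(PIND2))    ; /* wait until input (D2) is LOW */
        while (!(PIND & _BV(PIND2))) ; /* then catch the rising edge */
        wait_ticks(2UL * DELAY_US);    /* phase delay */
        PORTB |= _BV(PORTB0);          /* delayed output HIGH */
        wait_ticks(2UL * HIGH_US);     /* hold HIGH for the chosen time */
        PORTB &= ~_BV(PORTB0);         /* LOW until the next input edge */
    }
}
```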
 
If you use a PIC with a capture module (CCP1), it can capture the free-running timer when the input edge occurs. That timer value can then be used to sync the delay to an exact timer count, so 1 µs resolution is easy enough.

However, one problem will be with very short "delays", so a range of 500–1,000,000 µs would be better, as it allows some processing time.

Alternatively, if the input frequency is constant, you can get the full 0–1,000,000 µs range by simply adding exactly one second to the delay, so every edge is timed from the previous input cycle.

So basically you generate an exact, regular 1-second pulse of your own, and just maintain the phase delay between the constant input frequency and the generated output frequency.
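The same capture-then-compare trick exists on the AVR already in hand: Timer1 input capture (ICP1, Arduino D8) timestamps the input edge, and output compare A (OC1A, Arduino D9) flips the output pin in hardware at capture + delay, so interrupt latency never lands on an output edge. An untested sketch with prescaler 8 (0.5 µs per tick), covering phase shifts up to ~32 ms within one timer wrap; the full one-second range would need an overflow counter on top, and very short delays are still bounded by interrupt latency, as noted above. DELAY_TICKS and HIGH_TICKS are illustrative constants:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

#define DELAY_TICKS  (2U * 6U)    /* 6 us phase shift at 0.5 us per tick */
#define HIGH_TICKS   (2U * 100U)  /* 100 us output pulse */

ISR(TIMER1_CAPT_vect)                      /* input edge timestamped */
{
    OCR1A   = ICR1 + DELAY_TICKS;          /* schedule the rising edge */
    TCCR1A  = _BV(COM1A1) | _BV(COM1A0);   /* set OC1A on compare match */
    TIFR1   = _BV(OCF1A);                  /* clear any stale compare flag */
    TIMSK1 |= _BV(OCIE1A);
}

ISR(TIMER1_COMPA_vect)
{
    if (TCCR1A & _BV(COM1A0)) {            /* pin was just driven HIGH */
        OCR1A += HIGH_TICKS;               /* schedule the falling edge */
        TCCR1A = _BV(COM1A1);              /* clear OC1A on next match */
    } else {
        TIMSK1 &= ~_BV(OCIE1A);            /* idle until the next input edge */
    }
}

int main(void)
{
    DDRB  |= _BV(DDB1);                    /* OC1A (Arduino D9) as output */
    TCCR1A = 0;
    TCCR1B = _BV(ICES1) | _BV(CS11);       /* capture rising edges, clock/8 */
    TIMSK1 = _BV(ICIE1);                   /* input-capture interrupt on */
    sei();
    for (;;)
        ;                                  /* everything runs from the ISRs */
}
```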
 