
How long before the ear notices a delay?

Status
Not open for further replies.

krazy

New Member
If one were programming a microcontroller to perform other operations in series with sampling music on an ADC and converting it back out through a DAC, how long could the developer stop the ADC sampling and start it back up again before the ear would notice?

The code example I'm using now has a delay of about 10 microseconds between filling a buffer from the ADC and converting it back out to the DAC. Obviously your ear wouldn't notice that. But what is the maximum delay I could get away with?

Anyone?
 
If I listen to the Beatles there is a delay of about 40 years between the recording and playback and my ear doesn't notice. As long as the output buffer is being fed at the sample rate the delay from input to output is irrelevant.

Or did I misunderstand?

Mike.
 

:p

I think you may have misunderstood.

The ADC->DAC loopback operation works like this:
ADC samples X samples into buffer at whatever sampling rate ->
DAC converts these samples and plays out at pin

The A/D conversion and the D/A conversion happen sequentially in my implementation: one runs to completion and THEN the other starts (they aren't loading and unloading the buffer simultaneously).

As such, there is a delay, however minute, between the instant the signal is sampled and the instant it is played back.

My question is: how long can that delay be? I'd like to sneak some additional calculation in between the conversions, but obviously I don't want to sacrifice audio quality/continuity for it.
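A minimal sketch of that sequential scheme, assuming a generic microcontroller; adc_read() and dac_write() are hypothetical blocking calls paced at the sample rate, not the actual code from this thread:

Code:
/* Sequential loopback: fill the whole buffer from the ADC, then play it all
 * back out through the DAC. The gap the question is about sits between the
 * two loops, where the ADC is not sampling. */
#include <stdint.h>

#define BUF_LEN 256

extern uint16_t adc_read(void);          /* blocks until the next sample is due */
extern void dac_write(uint16_t sample);  /* blocks until the next output slot */

static uint16_t buf[BUF_LEN];

void loopback_once(void)
{
    /* Phase 1: ADC samples BUF_LEN samples into the buffer at the sample rate. */
    for (int i = 0; i < BUF_LEN; i++) {
        buf[i] = adc_read();
    }

    /* ...any extra calculation would go here: this is the gap in sampling... */

    /* Phase 2: DAC converts the samples and plays them out at the pin. */
    for (int i = 0; i < BUF_LEN; i++) {
        dac_write(buf[i]);
    }
}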
 
Assuming the ear is a lowpass filter with a cutoff frequency of 20 kHz, I would think that the time period undetectable by the ear would be the period of a frequency 1-5x that 20 kHz cutoff. The exact factor depends on the attenuation level at which you define the cutoff, and on how much the signal can change within that time period, which in turn depends on how much you low-pass filter the input signal.
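For concreteness (reading "1-5x" as a multiple of the 20 kHz frequency, which is my assumption): the period of 20 kHz is 1/20,000 s = 50 µs, and the period of 100 kHz is 10 µs, so this estimate puts the undetectable gap somewhere in the 10-50 µs range.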
 
krazy said:
My question is... how long can that delay be? I'd like to sneak some additional calculating in between the conversions but, obviously, don't want to sacrifice audio quality/continuity for it.

The delay can be as long as the number of samples your buffer can hold.

Or, are you saying that the original signal and the delayed signal are mixed before playback?

Or, are you saying that you don't play back during sampling and so there are gaps in the output data?

Edit: having reread your question, I see you're stating the latter. Can't you fix that so you are simultaneously filling one end of the buffer and emptying the other?

Mike.
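A minimal sketch of what Mike is suggesting, assuming a generic microcontroller whose timer interrupt fires once per sample; adc_read(), dac_write() and the ISR hookup are hypothetical placeholders, not from this thread. The interrupt keeps both conversions running at the sample rate while the main loop does its extra calculation on the half of the buffer the ISR isn't using (a ping-pong buffer):

Code:
/* Ping-pong (double) buffer: the ISR plays and refills one half while the
 * main loop processes the other half. adc_read()/dac_write() are
 * hypothetical hardware calls; substitute the real peripheral code. */
#include <stdint.h>
#include <stdbool.h>

#define HALF_LEN 128

extern uint16_t adc_read(void);
extern void dac_write(uint16_t sample);

static uint16_t buf[2][HALF_LEN];
static volatile uint8_t active_half = 0;  /* half the ISR is playing/refilling */
static volatile uint16_t idx = 0;
static volatile bool half_ready = false;  /* a half has been released for processing */

/* Called once per sample by a timer interrupt at the sample rate. */
void sample_isr(void)
{
    dac_write(buf[active_half][idx]);     /* play the sample processed last pass */
    buf[active_half][idx] = adc_read();   /* replace it with a fresh input sample */

    if (++idx >= HALF_LEN) {
        idx = 0;
        active_half ^= 1;                 /* ISR moves on to the other half */
        half_ready = true;                /* the half just finished is now free */
    }
}

int main(void)
{
    /* ...set up ADC, DAC and a sample-rate timer here (hardware specific)... */
    for (;;) {
        if (half_ready) {
            half_ready = false;
            uint8_t mine = active_half ^ 1;   /* the half the ISR just released */
            /* Extra calculation goes here; it must finish within HALF_LEN
             * sample periods, before the ISR comes back to this half. */
            (void)buf[mine];
        }
    }
}

With this arrangement the output never stops; the price is a fixed input-to-output latency of two half-buffers (2 x HALF_LEN sample periods), which ties back to the point above that the delay can be as long as the buffer.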
 
I have a friend who DJs, and he uses special audio drivers on his computer that use small buffers to avoid sync issues. By general consensus, anything under about 2 ms is considered fast enough. If, however, there is a big change in the ADC value across the gap when sampling picks back up again, there is going to be an audible 'tick' that anyone will hear.
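If a gap in sampling can't be avoided entirely, one way to soften that 'tick' (a sketch of a common trick, not something proposed in this thread) is to ramp the output linearly from the last sample played before the gap to the first sample captured after it; dac_write() is again a hypothetical placeholder paced at the sample rate:

Code:
/* Bridge a sampling gap with a short linear ramp so the DAC output has no
 * abrupt step between the pre-gap and post-gap sample values. */
#include <stdint.h>

extern void dac_write(uint16_t sample);

void bridge_gap(uint16_t last_before_gap, uint16_t first_after_gap, int ramp_len)
{
    int32_t step = (int32_t)first_after_gap - (int32_t)last_before_gap;

    for (int i = 1; i <= ramp_len; i++) {
        /* Linear interpolation between the two endpoint samples. */
        dac_write((uint16_t)((int32_t)last_before_gap + step * i / ramp_len));
    }
}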
 
A few milliseconds of delay mixed with the original signal sounds smeared. More than about 100 ms of delay sounds like an echo.
 
In the recording studio I do not care what the delay is. Some of the audio processing gear (peak limiters and compressors) has 2 ms of delay, and 20 ms would not be a problem.

Live, I hate to have much delay. I am already fighting the delay across the room, and any more is trouble. We add delay to each set of speakers to match how far they are from the stage. The 'monitor' or stage speakers have close to 0 delay. For stage use I do not use digital peak limiters with 2 ms of delay.

In your case, 5 samples of delay will not be noticed. I would try not to get close to 1 ms.
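To put numbers on that (assuming a 44.1 kHz sample rate, which the thread doesn't actually specify): 5 samples is 5 / 44,100 s ≈ 113 µs, and 1 ms corresponds to about 44 samples.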
 
For reference, the speed of sound is about 1 foot per millisecond, so every foot of difference in the distances from two speakers to your ear adds about 1 ms of delay difference between them.
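As a sanity check on that rule of thumb: sound travels at roughly 343 m/s in room-temperature air, which is about 1125 ft/s, or 1.1 ft per millisecond, so "1 foot per millisecond" is accurate to within about 10-15%.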
 
I guess the delay in a pure audio signal is of no importance at all.

Synchronizing picture and sound can become a problem when watching a movie, if lip movement and sound are not in sync.

In that particular case the eyes will notice the delay. :)

Boncuk
 