If a developer were programming a microcontroller to sample music on an ADC and convert it back out on a DAC, interleaved with other operations, how long could the ADC sampling be stopped and restarted before the ear would notice the gap?
The code I'm working with now has a delay of about 10 microseconds between filling a buffer from the ADC and converting it back out to the DAC. Obviously the ear won't notice a delay that short. But what is the maximum delay I could get away with?
Anyone?