I built an animatronic figure and wanted to synchronize its mouth movements to an audio signal. The mouth is operated by a servo motor controlled by a Lynxmotion SSC-32 controller, which has an on-board A/D converter.
I am an electronics hobbyist with no formal training. I have built some digital designs, but this is my first foray into the world of analog. I searched the web for relevant circuit designs and, with the assistance of this and other electronics forums plus simulation in LTspice, developed my circuit, which I call the AAEF (Audio Amplifier & Envelope Follower).
My design objective was to take an audio signal, scale it to the supply voltage of the controller (which is also the reference voltage of the A/D converter), and process it through an envelope follower, producing a 0-4+ VDC signal that maps to the amplitude of the original signal. Refer to the circuit below.
The first stage of the circuit scales the signal using an op amp. I used a trim pot to adjust the gain, since different sources have different levels; switching between line-level and headphone outputs required readjustment. The gain setting is also sensitive to the volume setting of the source (naturally).
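The scaling math for that stage can be sketched as follows. This is only an illustration of the standard non-inverting gain formula; the resistor values and source levels here are assumptions, not the actual parts in my circuit (the trim pot stands in for Rf):

```python
# Sketch of the gain math for a non-inverting op-amp stage.
# Resistor and signal values are hypothetical, for illustration only.

def noninverting_gain(rf: float, rg: float) -> float:
    """Gain of a non-inverting amplifier: 1 + Rf/Rg."""
    return 1.0 + rf / rg

# Example: scale an assumed ~0.3 V peak headphone-level signal
# toward a ~4 V peak target near the A/D reference voltage.
v_in_peak = 0.3              # volts, assumed source level
target_peak = 4.0            # volts, target near the A/D reference
needed_gain = target_peak / v_in_peak
print(f"needed gain = {needed_gain:.1f}")

# With a fixed Rg of 1 kOhm, the trim pot (Rf) would be set near:
rg = 1_000.0
rf = (needed_gain - 1.0) * rg
print(f"Rf = {rf:.0f} Ohm")
```

This is why a single fixed gain doesn't work across sources: a line-level input at several times the headphone level would need a proportionally smaller Rf setting.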
The op amp output feeds the envelope follower proper: a diode followed by a simple RC network that smooths out the remaining audio oscillations. I chose a Schottky diode for its low forward voltage drop, which minimizes signal loss.
There is also an audio amplifier on the board, based on that old standby, the LM386.
Additionally, I used the two channels of a stereo signal separately. The right channel drives a speaker in the animatronic's head; the left channel feeds the envelope follower portion of the circuit. This lets me "tweak" the amplitude of the envelope channel independently, providing an exaggerated envelope and thus more mouth detail. Also, the output of this circuit and the subsequent A/D conversion are processed by software, which introduces a barely noticeable lag; I can delay the audio track to match the control signal with an audio editing program (I used Audacity).
Below are two graphs comparing the original audio signal to the values returned by the software (written in FreeBASIC) after processing through the circuit and the controller.
The software analyzes the returned digital values and sends signals to the servo to move the mouth in synchronization.
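My software is in FreeBASIC, but the core of that step can be sketched in Python: map each A/D reading linearly onto a servo pulse width and format an SSC-32 move command. The pulse-width endpoints and the channel and timing values below are hypothetical; they would depend on the mouth linkage. (In practice the command string would be written to the SSC-32's serial port, e.g. with a library such as pySerial.)

```python
# Sketch of mapping an A/D reading to an SSC-32 servo command.
# Pulse-width endpoints (mouth closed/open) are hypothetical values.

def adc_to_pulse(adc: int, adc_max: int = 255,
                 pw_closed: int = 1500, pw_open: int = 2000) -> int:
    """Linearly map an A/D reading (0..adc_max) to a pulse width in us."""
    adc = max(0, min(adc, adc_max))
    return pw_closed + (pw_open - pw_closed) * adc // adc_max

def ssc32_command(channel: int, pulse_us: int, move_ms: int = 50) -> str:
    """Format an SSC-32 move command: '#<ch> P<pulse> T<time>' + CR."""
    return f"#{channel} P{pulse_us} T{move_ms}\r"

# A mid-scale envelope reading opens the mouth about halfway.
cmd = ssc32_command(0, adc_to_pulse(128))
print(repr(cmd))  # '#0 P1750 T50\r'
```

The T (time) parameter makes the SSC-32 ramp the servo to the target over the given interval, which smooths the motion between successive envelope samples.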
Here is my test setup.
Here is the circuit in action on my animatronic penguin, Peter: AAEF in Action