ADC voltage conversion


Tan9890

Hi,
I'm using an ATmega32 in my Sound Follower project.
I'm getting a voltage of 0 V to 0.7 V out of my audio detection circuit.

How should I use these voltages as an input to the ADC? I mean, can the controller convert voltages with decimal points (like 0.7 V) too?

Or do I need to add a new circuit? If possible, I would like to skip that step, because my hardware is already complete.

Please give me some suggestions.

Thank you.
 
Most ADCs built into microcontrollers are ratiometric to a voltage reference, which is typically the chip's VDD (0 to 5 V); sometimes it is a built-in 2.5 V reference (0 to 2.5 V).

You should amplify your signal so that it almost fills the full input range of your ADC, whatever range the chip's data sheet specifies.
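For reference, here is a minimal sketch of a single-ended read on the ATmega32, assuming AVCC (5 V) as the reference and the signal on channel ADC0; register and bit names come from avr-libc. Pick the prescaler so the ADC clock stays in the 50-200 kHz range the datasheet recommends. Treat this as a sketch, not drop-in code:

/* Minimal single-ended conversion on the ATmega32.
 * Assumes AVCC (5 V) reference and the input on ADC0. */
#include <avr/io.h>
#include <stdint.h>

void adc_init(void)
{
    ADMUX  = (1 << REFS0);               /* reference = AVCC, channel = ADC0 */
    ADCSRA = (1 << ADEN)                 /* enable the ADC                   */
           | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0); /* ADC clock = clk/128 */
}

uint16_t adc_read(void)
{
    ADCSRA |= (1 << ADSC);               /* start a single conversion        */
    while (ADCSRA & (1 << ADSC))         /* ADSC clears when it finishes     */
        ;
    return ADC;                          /* 10-bit result, 0..1023           */
}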
 
What is the form of this voltage from the audio detection circuit, AC or DC? The ATmega32 has an A/D converter, but it can only handle positive signals from 0 V to Vcc, so if the signal is AC you will have to add a DC offset to shift it into range. Two of the differential inputs have a selectable gain of 10x or 200x.
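On the code side, selecting one of those differential channels is just an ADMUX write. A sketch, assuming the pair ADC1(+)/ADC0(-) and the 10x setting; double-check the MUX4:0 value against the channel table in the ATmega32 datasheet:

/* Sketch: select the differential pair ADC1(+)/ADC0(-) with 10x gain.
 * MUX4:0 = 0b01001 is my reading of the channel table; verify it for
 * your pin pair. The result is a signed 10-bit value. */
#include <avr/io.h>

void adc_select_diff_10x(void)
{
    ADMUX = (1 << REFS0)   /* reference = AVCC                        */
          | (1 << MUX3)    /* MUX4:0 = 01001 -> ADC1(+), ADC0(-), 10x */
          | (1 << MUX0);
}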
 
How should I use these voltages as an input to the ADC? I mean, can the controller convert voltages with decimal points (like 0.7 V) too?
The ADC gives a binary number between 0 and full scale.

You can write it as a hexadecimal or decimal number if that is convenient for you, but the ADC still only gave you a binary number.

Example: if an 8-bit ADC's full scale is 2.55 V and its input is 2.55 V, it will give you a result of 11111111 (255). If your input is 0.7 V, the result is 01000110 (70, since 0.7 / 2.55 × 255 = 70). If you need a human-readable "0.70", your software must produce it.
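For the human-readable part, something like the sketch below would do, using the ATmega32's 10-bit result rather than the 8-bit example above and assuming a 5.00 V reference. The name format_voltage is just illustrative; integer math keeps avr-libc's default (non-float) printf sufficient:

/* Sketch: format a raw 10-bit reading as a printable voltage,
 * assuming a 5.00 V full scale. */
#include <stdio.h>
#include <stdint.h>

void format_voltage(uint16_t raw, char *buf, size_t len)
{
    uint32_t mv = ((uint32_t)raw * 5000UL) / 1023UL;  /* millivolts   */
    snprintf(buf, len, "%lu.%02lu V",
             (unsigned long)(mv / 1000),              /* whole volts  */
             (unsigned long)((mv % 1000) / 10));      /* hundredths   */
}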
 