Hi,
I'm currently creating a program which uses the on-chip ADC. On an interrupt, a section of code runs in which I need to obtain the A-to-D results from multiple inputs and store them in some variables that I use later.
I plan to convert one input and store the result, then change channel, convert that input, and so on.
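To make it concrete, here's roughly what my interrupt code does. This is just a sketch, assuming a PIC16-style part under XC8 and the usual Microchip register/bit names (ADCON0, GO_nDONE, ADRESH/ADRESL, CHS bits at positions 3-5); delay_with_timer0() is my own helper, shown further down:

[code]
#include <xc.h>   /* assuming a PIC16-style part under XC8 */

unsigned int result0, result1;   /* the vars I use later */

void delay_with_timer0(void);    /* my Timer0 wait, sketched below */

void read_inputs(void)           /* called from my interrupt code */
{
    ADCON0 = (ADCON0 & 0xC7) | (0 << 3); /* select channel 0 (CHS bits) */
    GO_nDONE = 1;                        /* start the conversion */
    delay_with_timer0();                 /* wait for it to complete */
    result0 = ((unsigned int)ADRESH << 8) | ADRESL;

    ADCON0 = (ADCON0 & 0xC7) | (1 << 3); /* switch to channel 1 */
    GO_nDONE = 1;
    delay_with_timer0();                 /* second delay: hangs here */
    result1 = ((unsigned int)ADRESH << 8) | ADRESL;
}
[/code]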
The problem, I think, is to do with the delay that's needed before the ADC result can be used. I'm using Timer0 to delay until the conversion completes, which is fine the first time, but the next time I try to delay, for the next channel, it seems to hang and never leave the timer loop. If I omit the timer delay for the second conversion, I get the wrong output. Any ideas would be much appreciated.
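The Timer0 delay itself is more or less this (again just a sketch; TMR0 and T0IF are the standard Microchip names, and I've left out the prescaler/preload setup):

[code]
void delay_with_timer0(void)
{
    TMR0 = 0;        /* restart the count from zero */
    T0IF = 0;        /* clear the overflow flag before waiting */
    while (!T0IF)    /* spin until Timer0 overflows... */
        ;            /* ...on the second channel it never leaves here */
}
[/code]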
PS: How do the timers work exactly? It's all very confusing. For example, what's a prescaler?
Thanx