18F4431 - Assembly - Xtal 4 MHz
Two-input ADC conversion with Vref+ = Vdd and Vref- = ground, one-shot mode (triggered every 3 seconds) with simultaneous sampling; Tacq = 2 TAD as per the datasheet.
The two 10-bit values are read sequentially from the ADRES buffer.
For testing, the inputs come from two resistive dividers (total R < 5K each). Noise is around 5 mV peak-to-peak on both.
I implemented oversampling to 12 bits to get ~1.2 mV resolution. The math for the oversampling and for the scaling seems solid, and the display shows steady values within 1 to 3 mV of the inputs.
What puzzles me is that at certain input values I can get periods, lasting up to several seconds, where one of the channels jumps back and forth between the "correct" value and one that is, most of the time, some 7 to 9 mV above it and fixed. Never anything in between. I cannot discern any timing pattern in the variation.
Examples:
Measured at the input: 2456 mV
Result on display: 2458 mV
Value occasionally offset to: 2465 mV
Measured at the input: 1639 mV
Result on display: 1641 mV
Value occasionally offset to: 1650 mV
I ruled out this being specific to a particular AN pin on the micro by swapping the inputs/dividers; it appears to occur for certain values regardless. I swapped micros as well.
I am puzzled by the offset being fixed rather than varying.
My question: could this be an artifact of the oversampling? I do not see why it would be, but I ask simply because I do not know what else to blame.
What would anyone suggest looking at? Thanks.