I was looking at some voltage references for ADCs (at Maxim) and I was wondering something...
There are 16-bit ADCs (actually, 18-bit ADCs that are accurate to 16 bits), but the most accurate and stable voltage reference I could find was nowhere near 16-bit accurate. Its variation over temperature was large enough that the reference is really only good to about 12 bits.
So what is the point of a 16-bit ADC if you can only get a voltage reference that is accurate to 12 bits?
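To make the mismatch concrete, here is a quick back-of-the-envelope sketch. The part values are hypothetical (a 4.096 V reference with a 20 ppm/°C tempco over a 50 °C range, feeding a 16-bit ADC), not taken from any specific Maxim datasheet:

```python
import math

# Hypothetical numbers for illustration only:
VREF = 4.096          # volts, nominal reference output
BITS = 16             # ADC resolution
TEMPCO_PPM = 20.0     # ppm/degC reference drift (assumed part spec)
DELTA_T = 50.0        # degC operating temperature range

lsb = VREF / (1 << BITS)                      # one ADC code step, in volts
drift_v = VREF * TEMPCO_PPM * DELTA_T * 1e-6  # worst-case reference drift
drift_lsbs = drift_v / lsb                    # that drift expressed in LSBs
accurate_bits = BITS - math.log2(drift_lsbs)  # absolute accuracy remaining

print(f"LSB = {lsb * 1e6:.1f} uV")                      # 62.5 uV
print(f"drift = {drift_v * 1e3:.3f} mV "
      f"= {drift_lsbs:.1f} LSBs")                       # ~65.5 LSBs
print(f"~{accurate_bits:.1f} bits of absolute accuracy")  # ~10 bits
```

With these assumed numbers the reference drift alone swamps roughly the bottom 6 bits, leaving about 10 bits of absolute accuracy, which is exactly the situation the question describes.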