My maths is rubbish so I guess I shouldn't get involved, but I'm intrigued by this now! I had always assumed that the scaling number for 10-bit was 1023, while LG is totally convinced it's 1024. My reasoning has always been that if you write out a 10-bit number on paper you start from 1, which would give you 1023 segments.
I have no idea which is correct, especially after reading the link from RB, as both arguments seem to stack up! Oh well, it's LG's project, so I'll chicken out and leave it up to him.
But then again, after thinking about it, when dealing with an ADC you're actually starting from 0, so I dunno.