Hi all,
I've been using micros for some time now and have written a number of different programs, but I've recently come to realise that my understanding of how timing is managed in PICs is flawed. I'm trying to create a timing sequence of 15 seconds.
My reasoning was: the OSC is running at 8 MHz, and if I set up Timer1 in 16-bit mode it overflows after 65536 counts (on the rollover from 65535 to 0).
The instruction cycle is 8 MHz / 4 = 2 MHz ----> I think this is the flaw in my logic.
So if I want an interrupt every 10 ms:
10 ms * (2*10^6) = 20000 counts
65535 - 20000 = 45535
So if I load 45535 into the timer and start it, I should get an interrupt every 10 ms, give or take the extra tick it takes to overflow (strictly 65536 - 20000 = 45536 gives exactly 20000 counts, but one tick is neither here nor there).
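In case it helps, here is a minimal sketch of what I mean, assuming a mid-range PIC16F-style Timer1 (register names as on a PIC16F877A) built with XC8; the register names will differ on other parts:

```c
#include <xc.h>

#define T1_PRELOAD 45535u  /* 65535 - 20000, as worked out above */

void timer1_init(void)
{
    T1CON = 0x00;                       /* clock = Fosc/4, 1:1 prescale, timer off */
    TMR1H = (T1_PRELOAD >> 8) & 0xFF;   /* load the preload, high byte first */
    TMR1L = T1_PRELOAD & 0xFF;
    PIR1bits.TMR1IF = 0;                /* clear any stale overflow flag */
    PIE1bits.TMR1IE = 1;                /* enable the Timer1 overflow interrupt */
    INTCONbits.PEIE = 1;                /* enable peripheral interrupts */
    INTCONbits.GIE = 1;                 /* enable global interrupts */
    T1CONbits.TMR1ON = 1;               /* start the timer */
}
```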
Then, by counting those interrupts (nested loops, in effect), I can build a longer delay: 1500 x 10 ms = 15 s. See the ISR sketch below.
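Rather than literal nested delay loops, the way I picture it is a counter in the ISR (again just a sketch, same PIC16F/XC8 assumptions as above; tick_count and delay_done are names I've made up):

```c
#include <xc.h>

#define T1_PRELOAD 45535u

volatile unsigned int tick_count = 0;   /* one tick per 10 ms overflow */
volatile unsigned char delay_done = 0;  /* set once 15 s have elapsed */

void __interrupt() isr(void)
{
    if (PIE1bits.TMR1IE && PIR1bits.TMR1IF)
    {
        PIR1bits.TMR1IF = 0;                /* acknowledge the overflow */
        TMR1H = (T1_PRELOAD >> 8) & 0xFF;   /* reload for the next 10 ms */
        TMR1L = T1_PRELOAD & 0xFF;
        if (++tick_count >= 1500)           /* 1500 x 10 ms = 15 s */
        {
            tick_count = 0;
            delay_done = 1;                 /* main loop polls this flag */
        }
    }
}
```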
So my question is: is this logic correct? I've asked a few of my colleagues in the lab and there is some uncertainty; we can't reach a consensus. What I do know is that, using this logic, my timer only lasts about half the time it should, roughly 8 seconds instead of 15. Perhaps I am using the wrong oscillator?