
How much time does a PIC micro take to complete tasks?

Djsarkar

Member
I have a general question. Assume we have a PIC12F1840, a sensor X to detect an object, and an output device Y.

Task 1: when the sensor detects an object, the micro reads the sensor output and finishes the first job.

Task 2: turn on device Y.

Response time of X is 60 ms
Response time of Y is 30 ms

IMG_20200809_101602.jpg

Approximately how much time will the PIC micro take to complete the two tasks?
 

Djsarkar

Member
Any reason it wouldn't be 90 ms?

Mike.
Thanks Mike - so the system takes 90 ms to complete the two tasks.

But can both tasks be completed in less than 90 ms?

I think this could be done using a timer interrupt. If so, for how long should the timer interrupt be set?
 

Pommie

Well-Known Member
Most Helpful Member
Assuming you're using an ultrasonic distance sensor, the time is dependent on the distance. Turning on an output pin will be very fast - with a 32 MHz clock it'll take 125 ns. If you tell us what you are trying to do and what PICs you have then we might be able to help more.

Mike.
Edit: if you use a photo interrupter then the sensing will be almost instantaneous too.
 

Djsarkar

Member
If you tell us what you are trying to do and what pics you have then we might be able to help more.
It is a theoretical question; I am not doing anything practical. I just want to understand how we can save CPU time when designing the program.

As given in my example, the PIC waits 60 milliseconds for the sensor, but the PIC is much faster than the sensor - it takes only a few nanoseconds to read the pin - so the PIC spends almost all of that time just waiting.

The input response is so slow that I want to do the second task during that time.

So I think a timer interrupt may play an important role in the system.
 

Nigel Goodwin

Super Moderator
Most Helpful Member
It is a theoretical question; I am not doing anything practical. I just want to understand how we can save CPU time when designing the program.

As given in my example, the PIC waits 60 milliseconds for the sensor, but the PIC is much faster than the sensor - it takes only a few nanoseconds to read the pin - so the PIC spends almost all of that time just waiting.

The input response is so slow that I want to do the second task during that time.

So I think a timer interrupt may play an important role in the system.
Not at all, it seems an unlikely use for a timer interrupt, mostly simple loops are all that's required.

Bear in mind, the 12F1840 you mention runs at up to 32 MHz (on its internal oscillator), which means it can execute 8000 instructions per millisecond. PICs generally spend almost all their time just waiting for an external event.

If you just ask meaningless made-up 'general questions', you'll never get a sensible reply, because there can't be one - every solution is different, although similar problems are likely to be solved in very similar ways.
 

rjenkinsgb

Well-Known Member
Most Helpful Member
Personally, I'd use input capture so the PIC can be doing other things until the sensor returns a signal, if high precision is needed, or just check the input in a fast timer interrupt for lower precision. With 10 kHz interrupts you would get roughly ±0.7" (17 mm) steps.

Ultrasonic measurement is a very slow system in computer terms; the round-trip delay is almost two milliseconds per foot distance.

One of the most basic principles of real-time programming is never use delay loops - use interrupts or proceed with the main program and check again next time around the loop.

The actual calculation time to convert each reading to distance units would be a few microseconds at most.
 

Djsarkar

Member
If you just ask meaningless made up 'general questions', you'll never get a sensible reply, because there can't be one - every solution is different, although anything similar is likely to be done in a very similar way.
When I am designing the program, I have two methods and have to choose the best one.

I think the interrupt method is the best because it takes less time, so the system will work at higher speed. Obviously this method will pay off in a complicated system.

IMG_20200809_140859.jpg
 

Pommie

Well-Known Member
Most Helpful Member
If you use a reflective opto sensor to detect the item then reading it will be almost instantaneous. BTW, you won't learn very much with theoretical questions.

Mike.
 

Djsarkar

Member
BTW, you won't learn very much with theoretical questions.

Mike.
Can you please clarify why I should not use a timer interrupt in my system? If I use the interrupt, I do not see any disadvantages; my system takes less time because of the interrupt. What is the reason to avoid a timer interrupt?
 

Nigel Goodwin

Super Moderator
Most Helpful Member
Can you please clarify why I should not use a timer interrupt in my system? If I use the interrupt, I do not see any disadvantages; my system takes less time because of the interrupt. What is the reason to avoid a timer interrupt?
He hasn't said anything like that - he merely said, as many of us have, that you're asking theoretical questions that won't help you in any way, as any answers are just as theoretical.

As you've never given us the slightest hint of what you're wanting to do (and most of your questions make no sense), we can't say whether timer interrupts are a good idea or not.

However, perhaps you're not aware - many very early low-spec PICs didn't provide interrupts at all.
 

Pommie

Well-Known Member
Most Helpful Member
There's no need for a timer interrupt. An edge-triggered I/O interrupt would make more sense. Why do you think interrupts make things quicker?

Mike.
 

Beau Schwabe

Active Member
An interrupt won't necessarily make things faster, it will just make the timing deterministic if you use the interrupt as a "throttle".

By making things deterministic, (i.e. setting your interrupt to have a 1ms "heartbeat") you can create your delay loops so they always have a predictable interval. That way if you add more switches or sensors in code, the delay does not become code dependent on other delays previously instantiated.

We have a similar situation at work where we are checking an IR reflective sensor; on occasion, a bottle on the conveyor belt creates a false reflection reading. So our debounce logic has two delays: one that prevents false triggering by waiting after the first trigger of the event (when the pin goes HIGH), and another that triggers after the pin has gone LOW. The flow is designed as a "fall-through", so that no single delay holds up valuable CPU time and each delay can be handled atomically.

The flow chart below depicts only ONE input. The reference to 2ND-SHOT is simply a second ONE-SHOT flag.
 


Djsarkar

Member
I agree with the advice given - a correct question gets the right answer. Maybe I was overthinking it.

I'll take a step back and try to understand when a timer interrupt is needed. I have mostly found reasons to use a timer interrupt, but I am looking for a real-time example.
 

rjenkinsgb

Well-Known Member
Most Helpful Member
I have mostly found reasons to use a timer interrupt, but I am looking for a real-time example.
Although the source is a hardware timer, I think of it as the software "clock" interrupt, to provide consistent timing and regular updates of things.

e.g. in the same sense as the clock oscillator for a hardware system, not a time-of-day clock.
 

Pommie

Well-Known Member
Most Helpful Member
Assuming a time-of-flight sensor, a free-running timer and an interrupt triggered by the pin receiving the return pulse would be more accurate. However, as this is just theoretical, it's pointless trying to work out the best method. Djsarkar, have you created, or are you able to create, an actual project?

Mike.
 

Djsarkar

Member
Djsarkar, have you created, or are you able to create, an actual project?

Mike.
Yes, I have completed a school assignment. We had an old 8051 development board, and I have worked with Keil. I have also studied the PIC micro datasheet.
 

Djsarkar

Member
What actual chips and sensors do you have available to you?

Mike.
I do not have a development board at home; schools are closed due to COVID-19. I will get a new development board as soon as possible. I know I cannot test a program without a board - that's the reason I'm asking some theoretical questions.
 

rjenkinsgb

Well-Known Member
Most Helpful Member
You do not need any specific "development board" - just get some stripboard, IC sockets and appropriate components for whatever project you want to try.
It's then permanent, rather than something that has to be scrapped to do something different on the development board.

You do need a programmer to work with PICs, but you can get a PICkit quite cheaply from eBay.

These are a few bits of mine, within reach of my desk at the moment:

prototypes.JPG
 
