Why? PIC Frequency Meter code

augustinetez

Active Member
I'm having a bit of a brain fart here.

With regard to the following text from a PIC Frequency Meter/Counter project (uses 4MHz xtal)

The PIC implements a 32 bit counter, partly in internal hardware and partly in software. Counting is enabled by turning off the internal pull-down transistor for "exactly" 0.4 second. At the end of this time, the PIC divides the count by 4, then adds or subtracts the appropriate IF frequency to get the actual frequency. The resulting count is converted to printable characters and delivered to the display.
unless I'm completely missing something here re "counting for 0.4 seconds and dividing the result by 4", wouldn't you just get the same result by counting for 0.1 seconds?

We're talking counting/measuring frequencies in the 1 - 50MHz range.

I've noticed several such projects around the 'net that do the same/similar thing.
 
The longer time can make the accuracy better, and possibly reduce jitter. If there is any difference in the turn-on and turn-off times of the pull-down transistor, that will affect how long the frequency is counted for. By measuring for a longer time, that difference is a smaller fraction of the counting time.
 
I could see that as a possibility.

Most of the projects have either a hardware calibration (adjust the xtal frequency) or a software calibration process to trim up the displayed frequency, but being amateur-style projects, I'm not sure there was much regard to jitter or the nth degree of accuracy.
 
My meter allows a 0.1 sec time window (actually a 0.4 sec count shifted two places right to get the 0.1 sec value), and that resolves the accuracy of the last bit. Meaning, you can differentiate between a 32768.5 Hz and a 32768.0 Hz crystal, instead of just 32768 vs 32770 Hz.
 
But accuracy is ultimately dependent on the accuracy of your timebase, and on your ability to accurately time events with a PIC. In the old days, with assembly, you could count clock ticks and make sure everything aligned. Now you have to carefully use counters and/or some off-chip hardware to synchronize everything. Not worse, just different (less straightforward than counting clock cycles). Ultimately, I'd rather use timers and let the chip do the parts it was designed to do.
 

Assembly still works today, as well you know :D

But even in the 'old days' with assembler, you still used hardware counters for frequency counters; that's how you could get 50MHz+ from a 4MHz clock speed. The classic frequency counter was in a VERY old application note, and used an OTP PIC and 7-segment LED displays.

Later on it was ported to the 16C84 (the first electrically re-programmable PIC), and the display was changed to a Hitachi-style LCD module; various updates using more modern devices have appeared over the years.

Now, on enhanced devices, it's even easier, as they have many more peripherals with much better controls.
 
