
Increasing frequency

Status: Not open for further replies.

robotkid

New Member
Hi,

I am looking for some help with a project idea I have had.

I want to build a circuit that will give me an output frequency 30% higher than the input frequency:

if the input freq = 100 Hz, the output frequency = 130 Hz
if the input freq = 150 Hz, the output frequency = 195 Hz

Would it be possible to do this using only hardware (maybe some kind of 555 circuit)?

In addition to this, would it be possible to tweak the 30% slightly using a pot or something? This would be for fine-tuning the equipment.

TIA
Robotkid
 
Doing this using simple circuits IS possible, but it won't be possible for a wide range of frequencies.

The principle is easy:
Input --> fixed pulse length --> low-pass filter --> VCO

Another limitation is that it will have a slight delay because of the low-pass filter.
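
The linearity of that front end comes from averaging: if each input edge triggers a fixed pulse of width t_p and height V_p, the filtered level is roughly V = f_in * t_p * V_p, which is linear in f_in only while f_in * t_p stays well below 1 (with t_p = 1 ms, say, a 100 Hz input averages to 0.1 * V_p). That product is what caps the usable range.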
 
Thanks for the reply, Grossel.

What limits the range of frequencies? Is that because of the low-pass filter?

The delay would not cause a problem in this application.
 
Hi.

Maybe somebody here has a different approach than mine. Every VCO I've seen has a non-linear relationship between input voltage and output frequency. On the other hand, the fixed-pulse-length circuit and the low-pass filter make a frequency-to-voltage converter, and that IS actually linear; it just only works in a limited frequency range.
Those two factors make this solution very simple to build, but it will only work within a limited frequency range.

In your case, if you fine-tune the timers so that 100 Hz --> 130 Hz, you'll discover that a 150 Hz input will output more than 195 Hz. That's because the relationship between output and input isn't linear but rather exponential.

You could use another stage in between, a so-called logarithmic amplifier. But that makes the circuit less easy, as you need to either calculate the antilog amplifier or spend a lot of time tuning it right.
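
To make the exponential concrete: if the VCO behaves roughly like f_out = f0 * e^(V/V0) (f0 and V0 being illustrative constants for the part, not datasheet values), then a control voltage proportional to f_in makes f_out grow exponentially instead of proportionally. A log stage computing V = V0 * ln(1.3 * f_in / f0) ahead of the VCO would straighten that out, which is the calculating/tuning work I mean.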
 
How about using a microprocessor to measure the input frequency, multiply it by 130%, and generate the output frequency?

What type of frequency resolution (minimum frequency change) do you need to detect?
 
How about using a microprocessor to measure the input frequency, multiply it by 130%, and generate the output frequency?
Do you know of an algorithm that is suitable for this? (I wonder about this question too.)

What type of frequency resolution (minimum frequency change) do you need to detect?
He uses 100-150 Hz as input in his example. That's probably the range of input frequencies.
 
Do you know of an algorithm that is suitable for this? (I wonder about this question too.)

He uses 100-150 Hz as input in his example. That's probably the range of input frequencies.
The algorithm is quite simple:
Sample the input.
Calculate the input frequency.
Multiply that by 1.3.
Use that value to generate the output frequency.

By resolution I mean the precision with which you need to measure the frequency: 1 Hz, 0.1 Hz, 0.01 Hz, etc.
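
A minimal sketch of that loop in C, with capture_period_us, set_output_period_us and read_pot as hypothetical placeholders for whatever timer-capture, timer-output and ADC facilities the chosen micro provides (the pot trim from the first post is folded in):

#include <stdint.h>

/* Hypothetical HAL, not a real library: */
uint32_t capture_period_us(void);        /* measure one input period, in us */
void set_output_period_us(uint32_t us);  /* reprogram the output timer */
uint16_t read_pot(void);                 /* trim pot via ADC, 0..1023 */

int main(void)
{
    for (;;) {
        uint32_t t_in = capture_period_us();  /* 100 Hz -> 10000 us */

        /* f_out = ratio * f_in, so t_out = t_in / ratio.
           The pot trims the ratio over roughly 1.25..1.35
           (held as a x1000 fixed-point number). */
        uint32_t ratio_x1000 = 1250UL + ((uint32_t)read_pot() * 100UL) / 1024UL;

        /* t_in * 1000 fits in 32 bits for the 100-150 Hz range here. */
        uint32_t t_out = (t_in * 1000UL) / ratio_x1000;

        set_output_period_us(t_out);          /* ~7692 us -> ~130 Hz */
    }
}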
 
The algorithm is quite simple:
Sample the input.
Calculate the input frequency.
Multiply that by 1.3.
Use that value to generate the output frequency.
Well, yes. But for real-time frequency conversion (with as good as no delay between input and output), I cannot see how to do this without using some sort of multitasking.
Maybe I'm thinking about it in too complicated a way?

By resolution I mean the precision with which you need to measure the frequency: 1 Hz, 0.1 Hz, 0.01 Hz, etc.
Oh, sure. My English is a little rusty :eek:
 
Well, yes. But for real-time frequency conversion (with as good as no delay between input and output), I cannot see how to do this without using some sort of multitasking.
Maybe I'm thinking about it in too complicated a way?
The delay for a standard microprocessor loop should be no more than a few ms. Is that too much?
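
For the multitasking worry: one edge interrupt is enough, because the output timer free-runs between updates. A rough C sketch along those lines, with timer_now_us and set_output_period_us again being hypothetical names:

#include <stdint.h>

/* Hypothetical, as before: */
uint32_t timer_now_us(void);             /* free-running microsecond timer */
void set_output_period_us(uint32_t us);  /* output timer keeps toggling on its own */

static volatile uint32_t last_edge_us;

/* Hooked to the input pin's rising edge. The ISR only retunes the
   output timer once per input cycle, so the delay is one input
   period plus a handful of instructions. */
void input_edge_isr(void)
{
    uint32_t now  = timer_now_us();
    uint32_t t_in = now - last_edge_us;   /* one full input period */
    last_edge_us  = now;

    /* f_out = 1.3 * f_in  <=>  t_out = t_in * 10 / 13 */
    set_output_period_us((t_in * 10UL) / 13UL);
}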
 