Hi guys,
I am using a 555 timer to generate a 40kHz output signal. I managed to build that easily from a circuit guide I found on the internet, using two resistors and two capacitors. The problem is that the output frequency changes with the supply voltage.

The timer's supply range is 5V to 15V. When I supply a voltage outside that range, the frequency changes dramatically, which seems normal. But even within 5V to 15V the frequency still drifts a bit, which I find odd: if I tune it to 40kHz at 5V and slowly raise the supply to around 15V, the frequency rises to about 42.5kHz, and in my application I need it fixed at 40kHz. I thought the output frequency should not be affected by the supply voltage while the chip is operating within its rated range.

So my questions are: is it normal for the frequency to change even within the supply voltage range, or is it a fault in my circuit or my 555 chip? If it is normal, could you explain in a bit more detail why that is so?
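For reference, here is the standard 555 astable formula I tuned against, which is nominally supply-independent. The component values below are just an example set I picked to land on 40kHz, not necessarily the ones from the guide:

```python
def astable_frequency(r1_ohms, r2_ohms, c_farads):
    """Ideal 555 astable output frequency in Hz: f = 1.44 / ((R1 + 2*R2) * C).

    The supply voltage does not appear here, because the 1/3 and 2/3 Vcc
    comparator thresholds scale with Vcc, cancelling its effect in the ideal case.
    """
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# Example values (my assumption): R1 = 1k, R2 = 17.5k, C = 1nF
f = astable_frequency(1_000, 17_500, 1e-9)
print(round(f))  # 40000 Hz
```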
Thanks a lot guys.