Greetings!
I put together the circuit from the attached schematic, and it works fine. The circuit uses a 9V source and the resistor called for is 4.7K.
What I can't figure out is how I would know what resistor value to use if I were to build this circuit on my own, or if I were to change the voltage or the number of LEDs. Bear with me as I walk through my understanding, and I hope you'll let me know where I'm going wrong. I'm sure it won't take long.
I built a circuit with just the 5 LEDs on 9V, and measured .006A between the last cathode and the negative terminal. I've read that red LEDs typically draw .02A, so my first question is whether my measurement is incorrect.
From the datasheet, I read that the transistor is rated at .15A.
On the attached circuit, I measured .7V between the resistor and the base of the transistor, which seems correct (I've read a transistor requires about .6V to switch on). So I figured the voltage drop across the resistor is 8.3V.
So, to determine the ohm rating required to get the .7 volts to the base, I figured (voltage source - voltage drop) / amps would equal the ohm value of the resistor: (9 - 8.3) / .156 = 4.48.
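Just to show my arithmetic isn't the problem, here's my calculation sketched in Python, using exactly the numbers above (these are my own figures, not anything from the schematic):

```python
# My reasoning: resistor value = (source voltage - drop across resistor) / current
v_source = 9.0   # supply voltage
v_drop = 8.3     # what I figured drops across the resistor
i = 0.156        # current in amps (the transistor's rated current)

r = (v_source - v_drop) / i
print(round(r, 2))  # prints 4.49, essentially the 4.48 I got above
```

So the math itself gives ~4.5 ohms, nowhere near the 4.7K in the schematic, which is what's confusing me.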
4.48 makes no sense, especially considering the difference between 4.48 and the 4.7K in the schematic. But for kicks, I dropped the resistor value to 1K, then 470, and the circuit still worked. But at 100 ohms, the transistor is no more.
I have a feeling I'm in way over my head on this, but hoped there might be a relatively simple idea I'm not grasping. I greatly appreciate your attempts to enlighten me! Thank you.