Hippogriff
Member
Hi all,
I'm new here and I'm guessing that "LED / resistor" questions are quite common. Hopefully I'm not going over too much old ground, but a view from some folk in the know on here would be very helpful to me.
I have a lovely RGB star LED that I want to control with a PIC. That's all working fine, but I want to make sure the LED is as bright as possible without it exhibiting its death wish. The LED has forward voltage figures of Red = 2.5 volts, Green = 3.8 volts, Blue = 3.8 volts and a maximum forward current of 350 mA. The power supply is 5 volts.
I figured, using the usual R = (Vsupply - Vf) / I equation, that I should be able to use resistors of around 7 Ohms, 3 Ohms and 3 Ohms for red, green and blue respectively. Going up a bit, for safety, I then figured on 10 Ohms, 5 Ohms and 5 Ohms.
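Here's my working, assuming I want the full 350 mA through each channel - please shout if I've slipped up somewhere:

Red:   R = (5 - 2.5) / 0.35 = 2.5 / 0.35 ≈ 7.1 Ohms
Green: R = (5 - 3.8) / 0.35 = 1.2 / 0.35 ≈ 3.4 Ohms
Blue:  R = (5 - 3.8) / 0.35 = 1.2 / 0.35 ≈ 3.4 Ohms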
Are those the lowest resistor values I could use to ensure this LED is the brightest it can be from a 5 volt supply? I only ask because the circuit I was initially following used 5, 10 and 15 Ohms respectively - I think that was designed to give a "warm" colour when the LED was white!
Anyway... I feel relatively comfortable with that equation. You work it out and then add a little bit for safety - that seems fine. I'm just looking for confirmation there.
However, what I don't understand is the wattage rating of said resistors and whether it even matters. I have some 0.6 watt resistors and some 1 watt resistors available to me.
If I choose the 0.6 watt resistors or the 1 watt resistors, will that affect the brightness of the LED or will it have no bearing whatsoever? I've tried to check with my eyes... but, ummmm, I can't seem to tell - so I'm wondering if anyone can tell me what the real story is here.
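For what it's worth, my possibly-wrong understanding is that the wattage rating only says how much heat the resistor can survive, not how bright the LED gets, and that the power the resistor has to burn off is P = I^2 x R. If that's right, then at the full 350 mA I get:

Red:   P = 0.35^2 x 7.1 ≈ 0.87 watts
Green: P = 0.35^2 x 3.4 ≈ 0.42 watts
Blue:  P = 0.35^2 x 3.4 ≈ 0.42 watts

...which would suggest the 0.6 watt ones aren't enough for the red channel. But I'm not at all sure I've applied that correctly, hence the question.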
I'd appreciate any advice that anyone on the forum can give - obviously I'm only starting out with this stuff, so please assume I know very little.