I need a little help calculating the proper value resistor to use to light an LED.
The power supply is rated at 12v but actually puts out 13.2v. The LED runs at 2.1v and 20ma.
I took the supply voltage, minus the LED voltage, and divided by the current: (13.2 - 2.1) / 0.02. That gave me 555 ohms for the resistor.
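As a quick sanity check of the arithmetic, here is a small Python sketch using the numbers above; the 560-ohm figure is my assumption about the nearest standard E24 resistor value:

```python
# Series-resistor calculation for the LED, using the values from the post:
# 13.2 V measured supply, 2.1 V LED forward drop, 20 mA target current.
V_SUPPLY = 13.2   # measured supply voltage (V)
V_LED = 2.1       # LED forward voltage (V)
I_TARGET = 0.020  # desired LED current (A)

r_ideal = (V_SUPPLY - V_LED) / I_TARGET
print(f"Ideal resistor: {r_ideal:.0f} ohms")  # 555 ohms

# 560 ohms is the nearest standard E24 value (an assumption, not from
# the post). Check the current and resistor power dissipation it gives.
r_std = 560
i_actual = (V_SUPPLY - V_LED) / r_std
p_resistor = i_actual ** 2 * r_std
print(f"With {r_std} ohms: {i_actual * 1000:.1f} mA, "
      f"{p_resistor * 1000:.0f} mW in the resistor")
```

With 560 ohms the LED current only drops to about 19.8 mA, which suggests the nearest standard value is close enough.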
Questions:
1. Did I do this correctly?
2. How accurate does the resistor value have to be? Obviously 555 is an oddball size. How close do I need to get for the LED to light properly and not burn out for a reasonable time?
Thanks,