# Calculate Resistor for LED use?

Status
Not open for further replies.

#### LEDman

##### New Member
I need a little help calculating the proper value resistor to use to light an LED.

The power supply is rated at 12v but actually puts out 13.2v. The LED runs at 2.1v and 20ma.

I took the supply voltage, less the LED voltage, and divided by the current, e.g. (13.2 - 2.1) / 0.02. That gave me 555 ohms for the resistor.
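That arithmetic can be double-checked with a quick script (a sketch using the voltages and current quoted above):

```python
# Series resistor for an LED: R = (V_supply - V_led) / I_led
supply_v = 13.2   # measured supply voltage (volts)
led_v = 2.1       # LED forward voltage (volts)
led_i = 0.020     # desired LED current (amps, i.e. 20 mA)

r = (supply_v - led_v) / led_i
print(f"Ideal resistor: {r:.0f} ohms")  # prints 555 ohms
```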

Questions:
1. Did I do this correctly?
2. How accurate does the resistor value have to be? Obviously 555 is an oddball size. How close do I need to get for the LED to light properly and not burn out for a reasonable time?

Thanks,

#### MikeMl

##### Well-Known Member
First, your calculation is correct. You could use a 560, 620, or 680 Ohm resistor (standard resistor values). However, there is likely no need to run the LED at full current. For just indoor panel indicator use, I typically use only 5 or 10mA, which gets you closer to a 1K resistor.
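To see what current each of those standard values actually gives (a quick sketch, assuming the same 13.2 V supply and 2.1 V forward drop from the original post):

```python
# LED current for a few standard resistor values: I = (V_supply - V_led) / R
supply_v = 13.2
led_v = 2.1

for r in (560, 620, 680, 1000):
    i_ma = (supply_v - led_v) / r * 1000  # current in mA
    print(f"{r:>4} ohms -> {i_ma:.1f} mA")
```

All of these keep the LED well under its 20 mA rating, and the 1K value lands near the lower-current range suggested above.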

For some battery powered applications where the LED current is a significant fraction of the total power, I have used some "ultra-bright" LEDs, and operated them at ~1mA, and they are still good to go, even when viewed outdoors in sunlight.


#### LEDman

##### New Member
Thanks

Thanks Mike. I tend to forget that LEDs are very bright these days. I'll try it out with 1K resistors and see how they look.

Bob

#### audioguru

##### Well-Known Member
I made many LED chasers and an audio VU meter with ultra-bright but wide-angle LEDs operating at 25mA. They are bright but not too bright.
It is the cheap LEDs that seem too bright (when they point directly at you), because they focus the beam into a very narrow angle.

Your vision's response to brightness is logarithmic, which is why you can see in sunlight and also by starlight. As a result, halving the current in an LED produces light that looks only slightly dimmer.

