LED Current Limiting Resistor

I'm embarrassed to ask such a basic newbie question here.

Is there a reliable rule of thumb to choose a suitable resistor value for a single LED where the current and forward voltage are unknown?

Like I imagine a lot of hobbyists do, I have a drawer full of spare LEDs with no specs or case markings. What I've usually done in the past is simply connect one up to a bench power supply set to the voltage I want to run the LED at, with a resistor decade box in series. I then gradually reduce the resistance until the LED brightness looks about right and the device isn't overheating, etc. It would be nice to do this without the guesswork, however, and to know I'm not shortening the life of the LED by using too low a value.

What I want to do is fit a 5mm, high-intensity green LED on a motorcycle which uses a 12V, 16Ah lead-acid battery. The intention is to provide a visual indicator on the handlebars when a certain microswitch opens/closes.

The LED I have seems to run OK at around 3 - 3.5V DC on my bench PSU (no resistor in series). I would like to feel reasonably confident I've got a value somewhere near correct for the 12V supply so I don't burn out the LED and have to take the bike apart again.

I would be grateful for any advice.

Trevor
 

Attachments

  • Spec Sheet.jpg
1) Your 12V battery runs at about 15V when charging. Maybe a little higher.
2) The voltage across the resistor is simply the battery voltage minus the LED voltage.
3) I would not run the LED at its absolute maximum current rating.
4) The current in the resistor = voltage/resistance.

Because your charging battery is about 3V higher than 12V and your LED drops about 3V, the resistor voltage is about 12V.

Ohm's law is simple:
12 volts across 12 ohms is 1 amp and 12 watts in the resistor.
12V / 120 ohms is 100mA = 1.2 watts.
12V / 1200 ohms is 10mA = 0.12 watts, or about 1/8 watt.
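For anyone who wants to play with the numbers, here is a minimal Python sketch of that same arithmetic (the ~15V charging voltage and ~3V LED drop are the assumptions from the post above):

```python
# Series-resistor current and power for an LED on a charging 12V battery.
# Assumed figures from the post above: ~15V battery while charging, ~3V LED drop.
V_BATTERY = 15.0   # volts, while charging
V_LED = 3.0        # volts, approximate LED forward drop

for r_ohms in (12, 120, 1200):
    v_resistor = V_BATTERY - V_LED        # voltage left across the resistor
    i_amps = v_resistor / r_ohms          # Ohm's law: I = V / R
    p_watts = v_resistor * i_amps         # power dissipated in the resistor
    print(f"{r_ohms:>5} ohm: {i_amps * 1000:6.1f} mA, {p_watts:5.2f} W in the resistor")
```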
 
5mm LEDs usually have a maximum current of 20mA (some can take 30mA).
If the 3V LED voltage is correct, the resistor should be (12 - 3)/0.02 = 450 ohms.
Take the nearest higher standard value, a 470 ohm, 0.5W resistor.
If the max. battery voltage is 14V then 560 ohms should be OK.
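A small Python sketch of that selection step, rounding up to the next E12 preferred value (the 12V supply, 3V drop and 20mA target are simply the figures from the post above):

```python
# Pick an LED series resistor: exact value, next E12 preferred value, power check.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12(ohms):
    """Smallest E12 preferred value that is >= ohms."""
    decade = 1.0
    while decade * E12[-1] < ohms:
        decade *= 10.0
    for base in E12:
        if base * decade >= ohms:
            return base * decade
    return 10.0 * decade

def led_resistor(v_supply, v_led, i_led):
    r_exact = (v_supply - v_led) / i_led          # e.g. (12 - 3) / 0.02 = 450 ohms
    r = next_e12(r_exact)                         # round up to a stock value
    i_actual = (v_supply - v_led) / r
    p_watts = (v_supply - v_led) * i_actual       # power the resistor must handle
    return r, i_actual, p_watts

r, i, p = led_resistor(12.0, 3.0, 0.020)
print(f"{r:.0f} ohm -> {i * 1000:.1f} mA, {p:.2f} W")   # 470 ohm -> ~19 mA, ~0.17 W
```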
 
A couple of additional comments. Different colors have different Vf values. Your DVM may have a diode-test mode which you can use to measure Vf. 1 mA is probably really safe; 10-20 mA is a usual operating current. The package probably has power limits, and you might be able to use that as an upper limit.
Once you have the LED lit, you can always measure Vf by putting the probes across the LED.
 
I have come to the conclusion that the LED is the most complicated electronic device ever invented :happy::happy::happy: judging by the number of queries about what is a suitable value for the series resistor.

All the info by Eric, Ron and jjw is valid, however...
...you want a quick and dirty rule of thumb to get you started with an unknown LED.

What I do when checking an unknown LED is select a series resistor based on the available supply voltage:

Supply = 5 volts, use a 470 Ohm resistor

Supply = 12 volts, use a 1K Ohm resistor

Supply = 24 volts, use a 2.2K Ohm resistor.

That will make the LED shine without blowing it up; you can then measure the voltage across the LED and make a reasonable calculation of the resistor value to give the current that you require in your application.

JimB
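A minimal sketch of that two-step procedure in Python (not JimB's code; the starter values are his rule of thumb, and the measured LED voltage below is just a placeholder for your own reading):

```python
# Step 1: start with a safe "rule of thumb" resistor for the supply voltage.
# Step 2: measure the voltage across the lit LED, then recalculate the resistor
#         for the current you actually want.
STARTER_RESISTOR = {5.0: 470.0, 12.0: 1000.0, 24.0: 2200.0}  # volts -> ohms

def recalculate(v_supply, v_led_measured, i_target):
    """Series resistor for the target current, given the measured LED drop."""
    return (v_supply - v_led_measured) / i_target

v_supply = 12.0
v_led = 3.1   # placeholder: replace with the voltage you measure across the LED
print("start with", STARTER_RESISTOR[v_supply], "ohms")
print("for 10 mA use about", round(recalculate(v_supply, v_led, 0.010)), "ohms")
```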
 
I use the same values as JimB as a general rule of thumb too :)

Grab an LED, whack a 1K resistor in series (on 9-12V) - if it's bright enough then job done. If not bright enough, measure the current, then pick the next resistor value down in the box if the current is <20mA.
 
Hi,

Are any LED specs really "unknown"?

If ALL the LED specs are unknown, you have no recourse but to try possibly destructive testing. If you have several of them this isn't a problem, but if you only have one then you don't want to do that.

The most important spec is the forward current limit. If you know the max current it can take, then you can test for the voltage, and that will tell you what resistor you should use - unless you want to run it at a lower current, in which case you should test at a lower current and determine what voltage it drops at that current. Of course you need a voltmeter and to know how to measure current.

If you don't know the current rating then you are stuck with estimating based on the size and shape. You'll notice that there are a lot of 20mA LEDs that come in the 5mm case size. But there are higher-power LEDs that can take more current. Since the current is related to the die size, you can compare die sizes with LEDs whose current specs you do know and see if they are similar. Remember, however, that many high-power LEDs need heat sinks, so you'll need a heat sink to run one at full power.
 
Thanks to all for the really helpful replies. It's much appreciated. I think the best thing for the future is to always make sure I get a spec sheet for each device! I can then calculate the resistor accurately.

But just in the interests of my own education, I would like to ask the following (my ignorance of theory is terrible), and this leads on from MrAl's "destructive testing" suggestion.

Am I right to assume that it's only the current flowing through the LED that matters? The voltage can be anything (above a certain minimum) as long as I use a resistor of sufficient value to slow the current down? In other words, it doesn't matter what voltage I use - 2.5, 12, 24 .... 220v etc as long as the current is controlled?

On the other hand, if I set my bench PSU to 3.0v, my LED runs fine even without a series resistor and even with the PSU's current setting cranked all the way to max. I assume that the LED is draining what current it needs on its own. Am I correct?

Many thanks again.

Trevor
 
On the other hand, if I set my bench PSU to 3.0v, my LED runs fine even without a series resistor and even with the PSU's current setting cranked all the way to max. I assume that the LED is draining what current it needs on its own. Am I correct?

No - you MUST add a current limiting resistor (or something else) to limit the current - attempting to run an LED from a voltage source is going to end in tears :D

The LED WILL blow relatively early, as you've no idea of what the current is, nor is it at all stable.
 
Thank you, Nigel. I got that loud and clear. I must know what the current is.

But when you say, "attempting to run an LED from a voltage source is going to end in tears", I suppose you mean "a voltage source where the current is not correctly limited"? There has to be some voltage, correct? Sorry if I'm taking you too literally.

 
Hi,

We can only control one or the other, current OR voltage, not both at the same time. We choose to control current because it works better as the LED characteristics change with temperature and aging.

For example, for an LED current change of 1 percent the voltage might only change by 0.1 percent. Looking at it the other way around, if the voltage changes by 0.1 percent the LED current might change by a full 1 percent.
Control of the LED is about covering for imperfections in the LED while still maintaining a good operating point, so if we try to control the voltage we've got to control it to within a very small percentage. If we control the current, it's a bit easier.

Also, the LED characteristic voltage has a tendency to change, which means that if it is run from a constant voltage source, the voltage source will never know what characteristic voltage the LED has changed to. So if we set it for 3.3V and it draws 20mA, the characteristic voltage is 3.3V at the time; but if it changes to 3.2V later, the LED will draw more current, and because the current can change much more than the voltage, it could draw a lot more current.

If we control the current though, the voltage will happily change from 3.3V to 3.2V but it will still draw just 20mA.

So the most important spec is the current spec, but we do have to make sure that our circuit can also deliver any voltage that the LED might need. If it needs 3.6V and our supply can only go up to 3.5V, then it won't be as bright.

One way to look at this is that it is like powering an LED with a voltage power supply that has current limiting. If we set the max voltage to 5V and the current limit to 20mA, the LED should run fine. We cannot set the power supply to 3.3V with no current limiting though, even if the LED draws 20mA *at the time*, because later it might try to draw much more current and then burn out. The power source in the current-limited case would be called a "current source", and that's the best way to drive an LED.

A series limiting resistor is used as an approximation to a current source. It's not a true current source, but it works in many cases and is cheaper and easier. It does have to be analyzed to make sure the LED can run with any changes to the LED and the power source.
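Here is a rough numerical sketch of that sensitivity, using an idealized exponential diode model; the saturation current and the 0.075V exponent scale are invented illustrative values, not data for any real LED:

```python
import math

# Idealized exponential LED model: I = IS * exp(V / N_VT).
# IS and N_VT are made-up illustrative constants, not a real LED's datasheet values.
IS = 1.5e-21    # "saturation current", amps
N_VT = 0.075    # exponent scale (emission coefficient times thermal voltage), volts

def led_current(v_led):
    return IS * math.exp(v_led / N_VT)

# Driven straight from a voltage source, a tiny voltage change moves the current a lot.
for v in (3.30, 3.33, 3.40):
    print(f"{v:.2f} V across the LED -> {led_current(v) * 1000:5.1f} mA")

# With a series resistor from a 12 V supply, the operating point barely moves,
# because the resistor soaks up any change in the LED's characteristic voltage.
def series_current(v_supply, r_ohms):
    v = 3.0                                  # initial guess for the LED voltage
    i = (v_supply - v) / r_ohms
    for _ in range(50):                      # crude fixed-point solve
        i = (v_supply - v) / r_ohms
        v = N_VT * math.log(i / IS)          # invert the diode equation
    return i

print(f"12 V supply, 470 ohm resistor -> {series_current(12.0, 470.0) * 1000:.1f} mA")
```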
 
Thank you, Nigel. I got that loud and clear. I must know what the current is.

But when you say, "attempting to run an LED from a voltage source is going to end in tears", I suppose you mean "a voltage source where the current is not correctly limited"? There has to be some voltage, correct? Sorry if I'm taking you too literally.

Yes, there has to be a voltage to generate the current - but what you want (in the best case) is a constant current.

Feeding it just from the power supply is providing it a constant voltage, and the current will vary massively with just a tiny change in voltage.

It's only Ohm's law; try a few examples - as in everything, you have to make certain assumptions:

Although you don't have a resistor, there's resistance 'somewhere' limiting the current - but it's VERY low resistance.

So assume this 'resistance' is 10 ohms, and the voltage dropped across it is 0.1V - this gives a current of 10mA - OK so far?

Now assume the PSU drifts 0.1V higher (a VERY small increase). You now have 0.2V across the resistance (because the LED voltage is relatively constant), giving 20mA - double the current for only a 0.1V change.
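In numbers, a tiny sketch of that example (the 10 ohm stray resistance and 2.9V LED drop are the same kind of illustrative assumptions as above):

```python
# A roughly constant LED drop with only a small stray resistance in series:
# a 0.1 V drift in the supply doubles the current.
R_STRAY = 10.0   # ohms, assumed stray/internal resistance - no real resistor fitted
V_LED = 2.9      # volts, treated as roughly constant

for v_supply in (3.0, 3.1):
    i = (v_supply - V_LED) / R_STRAY
    print(f"{v_supply:.1f} V supply -> {i * 1000:.0f} mA through the LED")
# 3.0 V -> 10 mA, 3.1 V -> 20 mA
```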
 
What was neglected here was the max reverse voltage, which is often about 6V, thus operating at AC requires two LEDs to be placed in inverse parallel or a series diode must be used. It's not a good idea to run LEDs in parallel.

Use the diode mode on your multimeter.
 
Also, if all you are interested in is using an LED as a visual ON/OFF indicator, say on battery-powered equipment, then you can operate most LEDs in a current-starved mode. I have used red ultra-bright LEDs as an ON indicator with as little as 250uA of current. Even at that low current, they are plainly visible in direct sunlight...
 
And to add to what Mike said, most modern LEDs are so bright that they are almost painful to directly look at when you run them at the full 20mA, so lately I tend to go for around 5mA for indicators etc.
 
What was neglected here was the max reverse voltage, which is often about 6V, thus operating at AC requires two LEDs to be placed in inverse parallel or a series diode must be used. It's not a good idea to run LEDs in parallel.

Use the diode mode on your multimeter.

You mean 5v :)
 
So bearing in mind the aim of my original question and all the excellent, informative replies, it sounds as if all one really needs to do for a quick and dirty solution is connect the LED to the intended voltage with some high-value series resistance and gradually reduce this until the LED is bright enough to be visible.


 
