
LED Current Limiting Resistor

So bearing in mind the aim of my original question and all the excellent informative replies, it sounds as if all one really needs to do for a quick and dirty solution is connect the LED to the intended voltage with some high value series resistance and gradually reduce this till the LED is bright enough to be visible.

Pretty well :D

However, I would (not 'quite' as crudely :D) just calculate roughly, ignoring the LED voltage entirely.

So 12V supply, you want 10mA - so simply 12/0.01 = 1200 ohms.

Obviously the actual current will be less than 10mA, and the error gets larger at lower supply voltages - but for a simple 'connect and make sure it lights up' it's 'near enough' :D
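As a minimal sketch (my own illustration, not part of the original reply), the rough rule above is just Ohm's law with the LED voltage dropped:

# Rough rule from the reply above: ignore the LED forward voltage entirely
# and size the resistor as R = Vs / I. The numbers are the 12V supply and
# 10mA target from the example.
supply_v = 12.0    # supply voltage, volts
target_i = 0.010   # desired LED current, amps (10mA)

r_rough = supply_v / target_i
print(f"Rough series resistor: {r_rough:.0f} ohms")  # -> 1200 ohms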
 
By current-starved you mean simply using a resistor value higher than what would be needed to achieve the specified max current rating of the LED. So basically, start with a high value and reduce it till the light intensity looks about right?


Also, if all you are interested in is using an LED as a visual On/Off indicator, say on battery-powered equipment, then you can operate most LEDs in a current-starved mode. I have used red ultra-bright LEDs as an On indicator with as little as 250uA of current. Even at that low current, they are plainly visible in direct sunlight...
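As a rough illustration (the battery voltage and forward voltage below are my own assumptions; only the 250uA figure comes from the post above), the resistor for a current-starved indicator is sized the usual way, R = (Vs - Vf) / I:

# Illustration only: current-starved indicator LED on a 9V battery.
# Assumed values: about 1.8V forward voltage for a red LED; the 250uA
# target is the figure mentioned in the post.
supply_v = 9.0       # battery voltage, volts (assumption)
led_vf = 1.8         # assumed red LED forward voltage, volts
target_i = 250e-6    # 250uA

r = (supply_v - led_vf) / target_i
print(f"Series resistor: about {r:.0f} ohms")  # roughly 28800 ohms (27k or 33k standard value)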
 
"about 6V" can be considered 5 V. That's what I get for not looking up the specs.

Hi,

He he :)

This is one of those specs I like to be very careful about, because even 5v might be too high, and if they do spec 5v max then we really have to respect that. It's similar to, say, 5v TTL, where I'd never want to run at 6v or "about 6v" - I'd want to run at very nearly 5v, plus or minus 5 percent. Adding to that, most of the LEDs I see do state 5v max reverse bias, and it always seems to be 5v if there is a spec at all. That makes it easy to remember too :)

LEDs are still evolving. Some LEDs are not even spec'd for reverse bias these days; they just state that "LEDs are not made to be reverse biased", or something like that. To me that means don't even reverse bias at 1v.
 
So bearing in mind the aim of my original question and all the excellent informative replies, it sounds as if all one really needs to do for a quick and dirty solution is connect the LED to the intended voltage with some high value series resistance and gradually reduce this till the LED is bright enough to be visible.

Hi,

That's a surefire way to do it every time :)

As others have pointed out, you can calculate a rough estimate for the resistor based on the power supply voltage, but there are times when this might not work very well, namely when the power supply voltage is comparable to the LED voltage. The best bet then is to assume the lowest possible LED forward voltage.

For example, if we take the lowest LED voltage to be 1.5v and the highest to be 3.6v, we can do two calculations from these two voltages (and a set current of 10mA):
R1=(Vs-1.5)/0.010
and
R2=(Vs-3.6)/0.010

So let's say the power supply voltage Vs is 20v, we get two values:
R1=(20-1.5)/0.010=1850 ohms
R2=(20-3.6)/0.010=1640 ohms

We can see that the value of the resistor will always be between 1640 and 1850 ohms. If we start with the higher value, we should be ok, then we can check the current and decrease if needed.

But let's say the voltage is only 10v, we get two values again:
R1=(10-1.5)/0.010=850 ohms
R2=(10-3.6)/0.010=640 ohms

and now we see that it would be a good idea to start with 850 ohms.

But what if the voltage is only 5v? We get two values again:
R1=(5-1.5)/0.010=350 ohms
R2=(5-3.6)/0.010=140 ohms

Here we see that we should start with 350 ohms, because if we start with 140 ohms we'll get a current that is too high if we happen to have an LED with a forward voltage near 1.5v:
I=(5-1.5)/140=0.025 amps.

But even that won't blow out the LED if it is rated for 20mA.

When the voltage gets REALLY close to the LED voltage, that's a different story, as with 4v:
R1=(4-1.5)/0.010=250 ohms
R2=(4-3.6)/0.010=40 ohms

From this we can see that a good starting value is (Vs-1.5)/I where Vs is the power supply voltage and I is the desired current.
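Putting the two-bound method above into a short sketch (Python, just to reproduce the numbers in this post; the 1.5v and 3.6v bounds and the 10mA current are the ones used above):

def resistor_bounds(vs, i=0.010, vf_low=1.5, vf_high=3.6):
    # Resistor assuming the lowest likely LED voltage (safe starting value)
    r_start = (vs - vf_low) / i
    # Resistor assuming the highest likely LED voltage (lower end of the range)
    r_min = (vs - vf_high) / i
    return r_start, r_min

for vs in (20.0, 10.0, 5.0, 4.0):
    r_start, r_min = resistor_bounds(vs)
    print(f"Vs = {vs:4.1f}V: start with {r_start:.0f} ohms "
          f"(range {r_min:.0f} to {r_start:.0f} ohms)")

# Prints 1850/1640, 850/640, 350/140 and 250/40 ohms, matching the worked cases above.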

Also worth mentioning: in a battery application we might want to calculate the current with a new battery and with a battery that has run down a bit, so we know how much brightness we will lose as the battery runs down over time.
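For example (the numbers here are my own assumptions, not from the post), with a nominally 9v battery that sags to about 7v, a 750 ohm resistor and an LED forward voltage of roughly 2v:

# Assumed example: how much the LED current drops as the battery runs down.
r = 750.0      # series resistor, ohms (assumption)
led_vf = 2.0   # assumed LED forward voltage, volts

for batt_v in (9.0, 7.0):
    i_ma = (batt_v - led_vf) / r * 1000
    print(f"Battery {batt_v:.1f}V -> about {i_ma:.1f}mA")

# About 9.3mA with a fresh battery, 6.7mA when run down - dimmer but still clearly lit.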
 