
LED limiting resistor


Othello

Member
If I have a constant voltage power supply, why use a limiting resistor with an LED?
Just set the supply to the exact Vf voltage.

Am I seeing this wrong?
 
It would be OK except that Vf goes down with temperature. The result is that the LED may burn up.

 
As the LED heats up, its Vf goes down, which causes its current to increase, which causes its temperature to increase, which causes its current to increase further, and so on until it burns up. This positive feedback is called thermal runaway.

Also, an LED is not simply a length of wire that gets white hot like the filament in an incandescent light bulb; it is a diode, and diodes are not all the same. Some will draw a very high current and burn out soon, and others might not even produce any light.
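To make that feedback loop concrete, here is a minimal Python sketch of it. All the constants (the exponential I-V slope, the -2 mV/°C Vf tempco, the thermal resistance) are illustrative assumptions, not datasheet values:

```python
import math

# Illustrative thermal-runaway model; all constants are assumptions.
I0, VF0 = 0.030, 3.2     # 30 mA at Vf = 3.2 V and 25 degC
NVT = 0.1                # assumed exponential slope factor, volts
K_TEMP = -0.002          # assumed Vf temperature coefficient, V/degC
RTH = 300.0              # assumed junction-to-ambient, degC/W

def led_current(v_led, temp_c):
    """Crude exponential I-V law around the 30 mA / 3.2 V point."""
    vf = VF0 + K_TEMP * (temp_c - 25.0)
    return I0 * math.exp((v_led - vf) / NVT)

def settle(v_supply, r_series, steps=30):
    """Alternate electrical and thermal balance until it settles (or not)."""
    temp = 25.0
    for _ in range(steps):
        # Solve i = led_current(v_supply - i*r_series, temp) by bisection.
        lo, hi = 0.0, v_supply / r_series if r_series else 10.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if led_current(v_supply - mid * r_series, temp) > mid:
                lo = mid
            else:
                hi = mid
        i = (lo + hi) / 2
        if i > 1.0:              # amps; the loop has clearly run away
            return None
        temp = 25.0 + RTH * (v_supply - i * r_series) * i  # junction heating
    return i, temp

print(settle(3.2, 0.0))    # supply set to the "exact Vf": never settles
print(settle(5.0, 60.0))   # same LED behind 60 ohms: ~31 mA at ~54 degC
```

With these numbers the bare LED spirals upward within a few iterations, while the resistor version settles only about 1 mA above its 25 °C operating point.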
 
Aha, I understand.
But a limiting resistor only dampens the effects you describe; it doesn't entirely prevent them, does it?
 
True, but if the resistor voltage drop is significantly larger than the normal LED voltage variations then the variation in LED current is usually acceptable. To totally eliminate the effects of the LED voltage variation you can use a constant-current source instead of a voltage source to drive the LED. And for maximum efficiency you use a constant-current switched-mode supply.
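As a quick illustration of that point, here is a sketch with made-up single-LED numbers (Vf nominally 3.2 V with a 2.9 to 3.5 V part-to-part spread; both supplies sized for 30 mA at the nominal Vf):

```python
# Current spread caused by Vf variation, for a small vs. large resistor drop.
# Hypothetical single-LED numbers: Vf nominally 3.2 V, spread 2.9 to 3.5 V.
def led_current(v_supply, vf, r):
    return (v_supply - vf) / r

for v_supply, r in [(5.0, 60.0), (12.0, 293.0)]:   # both give 30 mA at 3.2 V
    i_min = led_current(v_supply, 3.5, r) * 1000   # highest-Vf part, in mA
    i_max = led_current(v_supply, 2.9, r) * 1000   # lowest-Vf part, in mA
    print(f"{v_supply:4.0f} V, {r:3.0f} ohm: {i_min:.1f} to {i_max:.1f} mA")
```

With only 1.8 V across the resistor the same Vf spread moves the current by about ±17%; with 8.8 V across it, only about ±3%.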
 
Yes, true. But usually the spec gives you a minimum Vf at a given current. So if you size the resistor for this it will not operate outside the spec.
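In other words, the sizing rule is just this (hypothetical figures, matching the 4-LED example discussed later in the thread):

```python
# Size R from the datasheet's MINIMUM Vf: the lowest-Vf parts set the
# worst-case (highest) current. Hypothetical figures: 12 V supply,
# 4 LEDs in series, Vf(min) = 2.8 V, target maximum current 20 mA.
v_supply, n_leds, vf_min, i_max = 12.0, 4, 2.8, 0.020
r = (v_supply - n_leds * vf_min) / i_max
print(f"R >= {r:.0f} ohm")   # 40 ohms; higher-Vf parts simply draw less
```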
 
I would love to see your statements, especially crutschow's, laid out in an example.

I have two LED chains (4 LEDs in series each): one chain with a Vf of 3.2 V and I = 30 mA for each LED,
the other chain with a Vf of 2.5 V at I = 20 mA.

Both chains are connected to the same 12V supply and make up one small circuit board.

Currently I connect one chain directly to the 12 V supply, figuring 4 × 3.2 V ≈ 12.8 V, which is roughly OK.
The other chain is connected to the 12 V supply through a 100 Ohm limiting resistor.

Given your reasoning I should increase the power supply voltage, adjust the 100 Ohm resistor, and add another limiting resistor for the 3.2 V @ 30 mA LEDs!?!

A constant-current power supply would not work in my situation, I think, since I use many of these boards, switched on and off rather randomly, so the load varies widely.
And the power source has to accommodate this widely changing current.
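For what it's worth, a quick check of those two chains against Ohm's law, taking the nominal figures above at face value:

```python
# Quick check of the two chains, assuming the 12 V supply is exact and
# the nominal Vf figures above hold.
v_supply = 12.0

# Chain A: 4 x 3.2 V = 12.8 V needed, which is more than the supply.
print(f"{v_supply - 4 * 3.2:.1f} V")              # -0.8 V: no headroom left

# Chain B: 4 x 2.5 V = 10 V, leaving 2 V across the resistor at 20 mA.
print(f"{(v_supply - 4 * 2.5) / 0.020:.0f} ohm")  # 100 ohms, as already used
```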
 
Ahh yes, I see.
Vf also has a curve: at low current, Vf is lower. So your LEDs attached directly to the 12 volts are running at a much lower current than you might think, perhaps 5 or 10 mA instead of 30. If they are bright enough for you, this will work.
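A minimal sketch of why that curve matters so much, using a crude exponential diode model; the slope factor here is an assumption, not a measured value, so treat the exact number loosely:

```python
import math

# Crude exponential LED model anchored at the rated operating point.
# The slope factor NVT is an assumption, not a datasheet value.
I_RATED, VF_RATED, NVT = 0.030, 3.2, 0.1

def led_current(v_led):
    return I_RATED * math.exp((v_led - VF_RATED) / NVT)

# 4 LEDs directly across 12 V leaves only 3.0 V per LED, not 3.2 V.
print(f"{led_current(12.0 / 4) * 1000:.1f} mA")   # ~4 mA, far below 30 mA
```

A 0.2 V shortfall per LED cuts the current by roughly an order of magnitude under these assumptions, consistent with the 5 to 10 mA estimate above.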
 
Here is a simulation of your condition at various temperatures. As you can see, the LED with the limiting resistor still shows some variation, but it is much brighter than your directly connected chain, and yours has a wider variation with temperature.
 

Attachments

  • led res.png (69.2 KB)
Well ronv, thank you for that simulation.
What I gather from it is that the resistor 'smooths' things out when the temperature varies.

For me the question remains how to find the optimal resistor.
And, in a case like mine where 4 LEDs in series could be powered from a 12 V supply, whether it would be advantageous to go to, let's say, a 14 V power supply just to make room for a limiting resistor.
 
Did you consider that you cannot buy a "3.2V" LED? Unless you buy a few thousand LEDs, measure them all, then sort them, you get whatever they have, which might be anywhere from 2.8 V to 3.5 V.
Even if you bought a few thousand LEDs, most might be near 2.8 V or near 3.5 V, because each production run is different.

So if the datasheet says the minimum Vf is 2.8 V at the current you want, use that to calculate the value of the current-limiting resistor, because a minimum-Vf LED will draw the highest current and a 3.5 V LED will draw the lowest.

You talked about 4 LEDs in series with a resistor. With a supply of 12 V and LEDs that are all near 2.8 V, the resistor gets 12 V - (2.8 V × 4) = 0.8 V, so for 20 mA its value is 40 ohms, and you can use 39 ohms, which is a standard value.
But if the LEDs are all higher than 3.0V then they will not light.

If the supply is 14V and the resistor is calculated for 20mA in 2.8V LEDs then LEDs that are 3.5V will be extremely dim.

If the supply is 16V then LEDs that are 3.5V will light but will appear to be dim compared with 2.8V LEDs if both groups use the same resistor value. With 2.8V LEDs the resistor will have a value of 240 ohms for 20mA. But with 3.5V LEDs the current is only 8.3mA.

Try an 18V or 20V supply for the 4 LEDs.
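The arithmetic above can be laid out as a small sweep (same numbers: 4 LEDs in series, resistor sized for 20 mA with worst-case 2.8 V parts; the 18 V and 20 V rows are the suggested fix):

```python
# Resistor sized for 20 mA with minimum-Vf (2.8 V) parts, then the
# current that maximum-Vf (3.5 V) parts would draw through that resistor.
n, i_target, vf_min, vf_max = 4, 0.020, 2.8, 3.5
for v_supply in (12.0, 14.0, 16.0, 18.0, 20.0):
    r = (v_supply - n * vf_min) / i_target
    i_high = max(0.0, (v_supply - n * vf_max) / r) * 1000
    print(f"{v_supply:4.0f} V: R = {r:3.0f} ohms, "
          f"3.5 V parts draw {i_high:4.1f} mA")
```

At 12 V and 14 V the high-Vf parts get essentially nothing; at 18 V or 20 V they draw roughly 12 to 14 mA, a usable if still unequal current.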
 
I've written a calculator that works out how many LEDs you can put in series, and the resistor that you need, given the range of current that is acceptable.

https://mtrak.co.uk/led_calculator.html

Once you have allowed for reasonable supply and LED voltage tolerances, it isn't possible to get tight control of the current unless there is quite a lot of voltage across the resistor.
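The reason, in one line: a Vf shift of ΔVf moves the current by ΔVf / R, so the fractional change is ΔVf divided by the resistor's voltage drop. A tiny sketch with an assumed 0.3 V spread:

```python
# Fractional current change = Vf spread / voltage across the resistor.
vf_spread = 0.3                      # assumed volts of LED Vf variation
for v_r in (0.5, 2.0, 5.0):          # volts dropped across the resistor
    print(f"V_R = {v_r} V: current varies by about {vf_spread / v_r:.0%}")
```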
 