Is this right? Just checking my knowledge =D


Lac

New Member
This is how I have understood the way voltage drops work: when you have, for example, a +5V voltage source and connect an LED with a drop of 1.8V (rated at 15mA), the voltage after the LED will be 5 − 1.8 = 3.2V, so according to Ohm's law you'll need a 220 ohm resistor to make the LED light at the desired brightness.

So if you connect another LED in series with that LED, the voltage will drop a further 1.8V, so after the two LEDs the voltage will be 5 − 1.8 − 1.8 = 1.4V, and you'll need a 100 ohm resistor.

If you connect a third LED, the voltage will drop to zero and the LEDs won't light up.
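
In code, that reasoning looks like this (a quick sketch using the 5V supply and the 1.8V/15mA figures above; the 213 and 93 ohm results round up to the standard 220 and 100 ohm parts):

Code:
# Series-LED resistor sizing: the resistor must drop whatever voltage
# is left after the LED forward drops, at the desired current.
V_SUPPLY = 5.0   # volts
VF = 1.8         # forward drop per LED, volts
I = 0.015        # desired LED current, amps (15mA)

for n_leds in (1, 2, 3):
    headroom = V_SUPPLY - n_leds * VF   # voltage left for the resistor
    if headroom <= 0:
        print(f"{n_leds} LED(s): {headroom:.1f}V headroom -> won't light")
    else:
        print(f"{n_leds} LED(s): {headroom / I:.0f} ohms for {headroom:.1f}V")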

This right?

Cheers!
Lac.
 
You're almost on track... in fact, the first two suppositions are OK. The problem is when you get to the third LED in series.

Three LEDs, assuming that they are all the same, would present an effective circuit resistance of about 360 ohms (1.8V / 15mA = 120 ohms each). Thus, the three of them would yield a calculated circuit current of 13.89mA, and all would illuminate, but with somewhat less than maximum brightness.

The problem here is that there is no real safety in this circuit, as there is no passive device in place for current limiting.
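
For reference, here is that arithmetic as a sketch. It treats each LED as a fixed 120 ohm resistance; as later replies in this thread show, a real LED is not a resistor, so take this as ChrisP's approximation rather than a working design:

Code:
# ChrisP's constant-resistance approximation: each LED modelled as
# its rated operating point turned into a resistance (1.8V / 15mA).
V_SUPPLY = 5.0
R_LED = 1.8 / 0.015              # 120 ohms per LED under this model

r_total = 3 * R_LED              # 360 ohms for three in series
print(f"Predicted current: {V_SUPPLY / r_total * 1000:.2f}mA")  # 13.89mA

# The catch: this linear model conducts at any voltage, but a real LED
# passes almost no current below its forward voltage, so three 1.8V
# drops (5.4V total) cannot be sustained by a 5V supply.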
 
Let's look at two of these LEDs in parallel...

The stated design current for these LEDs (per your OP) is 15mA with a 1.8V forward bias. In a parallel circuit, the total circuit current is the sum of the currents in all branches. Thus, two of these LEDs in parallel should draw 30mA. If we use a single current limiter, we can calculate its value as 3.2V/0.030A, or approximately 100 ohms.

The parallel LEDs will each drop the normal 1.8V on their own branch, and the 30mA current will split between the two LEDs. The current limiter must then be sized for a 3.2V drop across it with 30mA through it.
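
The same arithmetic as a quick sketch (all values from the paragraphs above):

Code:
# One shared current limiter for two parallel LEDs: branch currents
# add, and the resistor drops the supply minus the LED forward drop.
V_SUPPLY = 5.0
VF = 1.8
I_PER_LED = 0.015

i_total = 2 * I_PER_LED               # 30mA through the shared resistor
r = (V_SUPPLY - VF) / i_total         # 3.2V / 0.030A
print(f"Shared limiter: {r:.0f} ohms carrying {i_total * 1000:.0f}mA")

That works out to about 107 ohms, hence the 100 ohm figure above.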
 
Hi,
NEVER connect LEDs directly in parallel!
ChrisP is right only as long as the Vf of both LEDs is exactly the same.
Due to production tolerances, Vf is always slightly different.
In that case one LED will take nearly all of the current, which leads to its destruction sooner or later.
When that happens, the remaining LED takes the whole current and will be destroyed too.
The solution:
Make strings of LEDs and a resistor connected in series.
Then you can use as many of these strings in parallel as you need without danger to the LEDs.
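
In numbers, each string is sized like a single series string, and the strings simply add their currents (a sketch; the two-LED string and the string count are just illustrative):

Code:
# One resistor per string: every string sets its own current, so
# parallel strings don't share a limiter or fight over current.
V_SUPPLY = 5.0
VF = 1.8
I_STRING = 0.015                  # desired current per string

leds_per_string = 2
r_string = (V_SUPPLY - leds_per_string * VF) / I_STRING
print(f"Each string: {r_string:.0f} ohm resistor")   # ~93 ohms

n_strings = 4                     # illustrative; any count works the same
print(f"{n_strings} strings draw {n_strings * I_STRING * 1000:.0f}mA")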
hth
regards
joachim
 
ChrisP said:
Thus, the three of them would yield a calculated circuit current of 13.89mA, and all would illuminate, but with somewhat less than maximum brightness.

When I test this on my breadboard, it works nicely with 1 and 2 LEDs, but when I put in a third it won't light up. :?

joachim said:
NEVER connect LEDs directly in parallel!
ChrisP is right only as long as the Vf of both LEDs is exactly the same.
Due to production tolerances, Vf is always slightly different.

Why would one LED take nearly the whole current if there is only a little difference in Vf between the two LEDs?

I love in-depth info :D

Cheers!
Lac.
 
Lac said:
Thus, the three of them would yield a calculated circuit current of 13.89mA, and all would illuminate, but with somewhat less than maximum brightness.

When I test this on my breadboard, it works nicely with 1 and 2 LEDs, but when I put in a third it won't light up. :?

It won't, because three LEDs in series (at 1.8V each) require 5.4V, so you don't have enough voltage for them to turn on. ChrisP was incorrect when he suggested it would work.

joachim said:
NEVER connect LEDs directly in parallel!
ChrisP is right only as long as the Vf of both LEDs is exactly the same.
Due to production tolerances, Vf is always slightly different.

Why would one LED take nearly the whole current if there is only a little difference in Vf between the two LEDs?

I love in-depth info :D

Try punching two holes in the side of a tin can, one slightly higher than the other, then slowly pour water into the can: it all runs out of the lower hole before it can reach the higher hole.

It's exactly the same with two LEDs in parallel, and I agree with everyone else: you shouldn't do it!
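
The tin-can picture can be put into numbers with the standard exponential diode model (a sketch; the slope parameter n·Vt = 50mV is an assumed typical value for an LED, not something from this thread):

Code:
import math

# Shockley diode model: at the same applied voltage, the current ratio
# between two parallel LEDs grows exponentially with their Vf mismatch.
N_VT = 0.050      # assumed: ideality factor ~2 times the 25mV thermal voltage
I_TOTAL = 0.030   # the 30mA from the shared-limiter example above

for dvf_mv in (10, 25, 50, 100):
    ratio = math.exp(dvf_mv / 1000.0 / N_VT)    # I_lowVf / I_highVf
    i_low = I_TOTAL * ratio / (1.0 + ratio)     # the greedy LED's share
    print(f"Vf mismatch {dvf_mv:3d}mV: lower-Vf LED takes "
          f"{i_low * 1000:.1f}mA of {I_TOTAL * 1000:.0f}mA")

A 100mV mismatch already pushes about 26mA of the 30mA through one LED, and heating makes it worse: the hotter LED's Vf falls further, so it grabs an even larger share, which is the sooner-or-later destruction joachim describes.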
 
Nigel Goodwin said:
It won't, because three LEDs in series (at 1.8V each) require 5.4V, so you don't have enough voltage for them to turn on. ChrisP was incorrect when he suggested it would work.

If the 1.8V forward voltage is the minimum value, then I agree that they will not light with three in series. However, if that Vf is a maximum value, or even a typical value, then I would expect that three of them in series would light when powered by a 5V source. :)
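
A quick check of that point (a sketch; the 1.6 to 2.0V spread is an assumed illustrative tolerance band, not a datasheet figure):

Code:
# Whether three series LEDs light from 5V depends on where in the
# tolerance band the actual forward voltage falls.
V_SUPPLY = 5.0
for vf in (1.6, 1.8, 2.0):        # assumed low / as-stated / high Vf
    needed = 3 * vf
    verdict = "lights (dimly)" if needed < V_SUPPLY else "stays dark"
    print(f"Vf = {vf}V each: needs {needed:.1f}V -> {verdict}")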

As to the wiring of LEDs in parallel, my response was more of a theoretical answer to the voltage drop question than an actual recommendation to connect them that way...
 
