Lighting up some LEDs (total newb)


myshtern

New Member
Hey guys, I'm very new to electronics so any help would be much appreciated.

I've got several big LEDs that I want to light up. Here are their ratings: 2.4V/20mA.

I'm going to use an old computer's power supply to get steady 12V DC power from a wall outlet. It has 12V+ and 12V- pins and can supply up to 200 watts, so about 16 amps of capacity.

If I wire 5 of those LEDs in series, (2.4 x 5) = 12V. Do I not need any resistor? Doesn't the current need to be regulated? That's the part I don't understand.

Thank you!
 
What color are they? Yes, no matter what, the current needs to be regulated.
 
A resistor is the easiest way; check out this online calculator.
**broken link removed**

Keep in mind that you can't really regulate the current if the resistance is too low; for good measure, pick a number of LEDs that results in a resistor of around 100 ohms.

I put in 4 LEDs with a 2.4V drop at 20mA and an input of 12 volts, which works out to a 120 ohm resistor for the 4-LED series string, with 0.048 watts dissipated in the resistor. Watch that dissipation value carefully, because high-dissipation resistors are two things: expensive and wasteful. Common cheap carbon resistors don't go much below 1/8 of a watt. I would recommend thin film resistors; not as cheap, but much more stable.
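
If it helps to see the arithmetic, here's a minimal Python sketch of that calculation, assuming the values from this thread (12V supply, 2.4V/20mA LEDs) - just swap in your own numbers:

```python
# Sketch: series-resistor sizing for an LED string.
# Assumes the values from this thread: 12 V supply, 2.4 V / 20 mA LEDs.

V_SUPPLY = 12.0   # supply voltage (V)
V_LED = 2.4       # forward voltage per LED (V)
I_LED = 0.020     # target forward current (A)

def series_resistor(n_leds):
    """Return (resistance in ohms, resistor power in watts) for n LEDs in series."""
    v_drop = V_SUPPLY - n_leds * V_LED   # voltage left over for the resistor
    if v_drop <= 0:
        raise ValueError("no headroom left for the resistor - use fewer LEDs")
    r = v_drop / I_LED                   # Ohm's law: R = V / I
    p = v_drop * I_LED                   # resistor dissipation: P = V * I
    return r, p

for n in (3, 4, 5):
    try:
        r, p = series_resistor(n)
        print(f"{n} LEDs: {r:.0f} ohm resistor, {p:.3f} W dissipated")
    except ValueError as err:
        print(f"{n} LEDs: {err}")
```

For 4 LEDs that reproduces the 120 ohm / 0.048 W figures above; for 5 LEDs it flags that the string leaves no voltage for the resistor to work with.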
 
Series resistors limit current. They cause a voltage drop as well. For the current regulation of an LED it's the current limiting effect that's important.
 
Okay, but how do I figure out what kind of resistor I need based on my input current instead of my input voltage?

I'm using a PSU that has 12V outputs, but it can put out 10-20 amps on those outputs. Do you guys see what I'm saying?
 

Yes: the voltage is 12V and the available current is more than sufficient for your requirements - that's all you need to know.

Use Ohm's law to calculate the resistor value based on the current you want.
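
To make that concrete, here's a small Python sketch of that Ohm's law step, assuming the 12V supply and a string of four 2.4V/20mA LEDs as in the earlier post, then rounding up to a standard E12 resistor value so the actual current lands at or just below the target (the four-LED string and the E12 table are just assumptions for the example):

```python
import math

# Ohm's-law resistor pick, then round up to a standard E12 value.
# Assumed example: 12 V supply, four 2.4 V LEDs in series, 20 mA target.

E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12_at_or_above(r_ideal):
    """Smallest standard E12 resistor value >= r_ideal (ohms)."""
    decade = 10 ** math.floor(math.log10(r_ideal))
    for mult in E12 + [10.0]:        # 10.0 rolls over into the next decade
        if mult * decade >= r_ideal:
            return mult * decade

v_resistor = 12.0 - 4 * 2.4          # voltage the resistor has to drop (V)
i_target = 0.020                     # current you want through the LEDs (A)

r_ideal = v_resistor / i_target      # Ohm's law: R = V / I
r_std = next_e12_at_or_above(r_ideal)
i_actual = v_resistor / r_std        # current you'll actually get

print(f"ideal {r_ideal:.0f} ohm -> standard {r_std:.0f} ohm "
      f"-> about {i_actual * 1000:.1f} mA")
```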
 
So basically, the LEDs draw whatever current they need; it's not like the PSU is trying to force 10 amps into them. Correct?
 
No, the LED drops a (reasonably) constant voltage across itself - and will be instantly destroyed - UNLESS you limit the current through it, the simple way being a current-limiting resistor.
 
Look at this curve to see what Nigel is talking about. This is for a common diode, but the curve is valid for all common diodes, LEDs included; the forward and reverse voltages are just different for LEDs. Notice the sharp curve up in current with increased voltage? That's why the current needs to be regulated. Small increases in voltage can dramatically increase the forward current, and the diode will self-destruct if it sees more than its max forward current for even a VERY brief period.
**broken link removed**
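
To put rough numbers on that curve, here's an illustrative Python sketch of the idealized Shockley diode equation, I = Is * (exp(V / (n * Vt)) - 1); the saturation current and ideality factor below are made-up values chosen so the example diode passes about 20mA near 2.4V, not data for any real LED:

```python
import math

# Idealized Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1).
# Is and n below are illustrative guesses, NOT datasheet values for a real LED;
# they are picked so the example diode passes roughly 20 mA around 2.4 V.

I_S = 7.5e-16    # saturation current (A), illustrative
N = 3.0          # ideality factor, illustrative
V_T = 0.02585    # thermal voltage at about 300 K (V)

def diode_current(v):
    """Forward current (A) at forward voltage v (V)."""
    return I_S * (math.exp(v / (N * V_T)) - 1.0)

for v in (2.2, 2.3, 2.4, 2.5, 2.6):
    print(f"{v:.1f} V -> {diode_current(v) * 1000:6.1f} mA")
```

With those made-up numbers, adding just 0.2V multiplies the current by more than ten, which is exactly the runaway the curve shows and why the current has to be limited externally.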
 