
Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.


Resistors and LEDs....

Status
Not open for further replies.

limlik

New Member
Knowing what resistor to use is way too confusing for me. I thought I had figured it out, but then I started taking junk apart and trying to figure things out, and I have run into a confusing issue. Almost every white 5mm LED I have used/seen requires 3.6ish V to 4.0 V max, but I have seen them with 5V going to them with just a resistor. Why isn't that LED burning out? Or is it the max current that matters?
 

The value of the resistor is chosen so that it limits the current through the LED to the specified value.

Typically 10 mA to 20 mA for standard LEDs.
 
Ok, but why does it say a max voltage and yet I see LEDs being used with a higher voltage?

The resistor drops the higher voltage down to the lower voltage for the LED.

Example: say the white LED is rated at 4V at 20mA and you have a 5V supply; you need to drop 1 volt across the resistor.

Ohm's Law says R = V/I, so R = 1 V / 0.02 A = 50 ohms.

So, subtract the LED voltage from the supply voltage and divide the answer by the LED current; the answer is in ohms.

OK? :)
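The rule in that post can be sketched in a few lines of Python (an illustration, not part of the thread; the function name is made up here):

```python
# A minimal sketch of the series-resistor rule described above:
# R = (supply voltage - LED forward voltage) / desired LED current.
def led_resistor(v_supply, v_led, i_led):
    """Return the series resistor value in ohms (i_led in amps)."""
    if v_supply <= v_led:
        raise ValueError("supply must exceed the LED forward voltage")
    return (v_supply - v_led) / i_led

# The worked example above: 5 V supply, 4 V white LED, 20 mA.
print(led_resistor(5.0, 4.0, 0.020))  # 50.0 ohms
```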
 
So hooking a 2.6 V max, 60 mA red LED to a 4.5 V power supply, I would need a...?

A standard red LED would only handle about 20 mA.

So 4.5V - 2.6V = 1.9V, so the resistor is 1.9 V / 0.02 A = 95R; say 100 ohms.
 
It isn't a standard red LED; I misread the rating, it is 50 mA. It is a full color LED.

All you have to do is:

So 4.5V - 2.6V = 1.9V, so the resistor is 1.9 V / 0.05 A = 38R; say 39R.
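Both calculations can be checked with the same formula (a quick Python sketch, not from the thread):

```python
# (v_supply - v_led) / i_led, as in the replies above.
def led_resistor(v_supply, v_led, i_led):
    return (v_supply - v_led) / i_led

# 4.5 V supply, 2.6 V red LED, at the two currents discussed:
print(led_resistor(4.5, 2.6, 0.020))  # about 95 ohms -> fit 100R
print(led_resistor(4.5, 2.6, 0.050))  # about 38 ohms -> fit 39R
```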
 
Ok, so before I go trying something dumb with these I want to double check that I am reading the info correctly. When the box says FW supply this means?
 

Do you mean mW, milliwatts?
 
FW supply means the voltage drop across the LED, measured from the positive to the negative terminal.

An LED acts like a zener, so at 2.0 volts the LED will be bright.

At 2.6 volts the LED will be at its maximum voltage and perhaps a little brighter.

It is always good practice to underrun the stated LED maximum currents and maximum voltages, e.g.:

20 mA LED run at 15 mA
2.4 volts across the LED in the above example.
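That derating suggestion works out like this (a sketch assuming the 4.5 V supply from the earlier example, which the post does not state explicitly):

```python
# Run a 20 mA LED at 15 mA with about 2.4 V across it.
# The 4.5 V supply is an assumption carried over from the earlier posts.
v_supply, v_led, i_led = 4.5, 2.4, 0.015
r = (v_supply - v_led) / i_led
print(r)  # about 140 ohms
```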
 
Hi limlik,

maybe you can use this little schematic for better understanding.

If the calculated resistor value is not a standard value, e.g. 170Ω, use the next higher standard value (180Ω) - never a lower one!

Boncuk
 

Attachments

  • LED-RLIMIT.gif
Always use the nominal voltage drop, not the maximum voltage drop and use the recommended operating current not the absolute maximum.

Round the resistor up to the next preferred value not down, especially if you're using a current close to the maximum. For example, if you calculate 50R, use 56R rather than 47R.
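That "round up to the next preferred value" step can be sketched in Python. The E12 (10%) series is assumed here because it matches the 50R → 56R example; the thread doesn't name a series, and `next_e12_up` is a made-up helper name:

```python
import math

# E12 preferred-value mantissas (the common 10% resistor series).
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12_up(r):
    """Smallest E12 standard resistor value >= r (r in ohms, r > 0)."""
    decade = 10.0 ** math.floor(math.log10(r))
    for base in E12:
        if base * decade >= r - 1e-9:
            return round(base * decade, 6)
    return round(10.0 * decade, 6)  # r falls past 8.2 in this decade

print(next_e12_up(50))   # 56.0 (never round down to 47)
print(next_e12_up(95))   # 100.0
print(next_e12_up(38))   # 39.0
```

This also matches the earlier advice of taking 170 ohms up to 180 ohms rather than down.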
 
