What voltage can a common LED run at?


Lac

New Member
Would an LED rated at 2V 20mA survive happily at 3V 20mA? How about 1.2V 20mA, would it light up just as much as it would if it were fed 2V 20mA?

Cheers!
Lac.
 
Lac said:
Would an LED rated at 2V 20mA survive happily at 3V 20mA? How about 1.2V 20mA, would it light up just as much as it would if it were fed 2V 20mA?

You can't feed things with a specific voltage and current like that, not even a simple resistor - if you fed a resistor 2V and it took 20mA, feeding it 3V would make it take 30mA - simple Ohm's law.

However, an LED isn't a simple resistor; it's essentially a forward-biased semiconductor - so you can't feed it a voltage, you need to feed it a current - THAT'S why you require a series resistor (to limit the current).

LEDs require differing amounts of voltage to turn them on: 1.2V wouldn't be enough, 2V would do for some, and 3V would do for most - but you MUST use a current-limiting resistor; applying 3V directly to an LED will probably destroy it instantly.

The brightness of the LED depends on the current through it; the voltage across it will stay pretty well the same.
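To put numbers on that, here's a minimal sketch of the usual series-resistor calculation. The 2V / 20mA figures are the ones from the original question; the 5V supply is only an assumed example.

```python
# Series resistor for an LED: drop the difference between the supply voltage
# and the LED's forward voltage at the desired current.
# 2V and 20mA are from this thread; the 5V supply is just an assumed example.

def led_series_resistor(v_supply, v_forward, i_led):
    """Return the resistance (ohms) needed to limit the LED current."""
    return (v_supply - v_forward) / i_led

v_supply = 5.0      # volts (assumed example supply)
v_forward = 2.0     # volts, the LED forward voltage from the question
i_led = 0.020       # amps (20 mA)

r = led_series_resistor(v_supply, v_forward, i_led)
print(f"Series resistor: {r:.0f} ohms")   # -> 150 ohms
```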
 
Hmm... learned something new there, thanks! But if the voltage is, e.g., 18V, will the current always be 180mA if I don't apply any load? 15V would give 150mA? 230V would give 2.3A, and so forth? I think I'm wrong, but that's what seems most logical to me right now.

Cheers!
Lac.
 
Ideally, yes, but it still all goes back to V=IR. Also, I've had some large white and blue LEDs survive currents in excess of 80mA, but that's a very rare occasion and I wouldn't recommend it for other colors and sizes (it could cut a few thousand hours off your LEDs' life expectancy). Like Nigel says, you'll need a certain voltage to turn an LED on; 3 volts seems to be a good minimum. Bill Bowden has a nice **broken link removed** on the web for series LEDs.
 
Lac said:
Hmm... learned something new there, thanks! But if the voltage is, e.g., 18V, will the current always be 180mA if I don't apply any load? 15V would give 150mA? 230V would give 2.3A, and so forth? I think I'm wrong, but that's what seems most logical to me right now.

If you don't apply a load, the current will be zero; current only flows into a load - without a load there can be no current.

If you have a power supply which is rated at 18V 1A, the 1A is simply the maximum current it can supply - if you put an 18 ohm resistor across it as a load, it will draw 1A from the supply - Ohm's law, V = I×R, or rearranged for this example, I = V/R. Likewise, if you use a 180 ohm resistor, the current drawn will be only 0.1A.

The examples you gave above would all correspond to a 100 ohm load.
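For the record, the same arithmetic as a small sketch; the 18 ohm and 180 ohm loads are the ones used above, and the 100 ohm case is the ratio assumed in the examples being quoted.

```python
# Ohm's law: the load sets the current; the supply just provides the voltage
# (up to its maximum rated current). Values are the ones used in this post.

def current(v, r):
    """I = V / R"""
    return v / r

v_supply = 18.0
for r_load in (18.0, 180.0, 100.0):
    print(f"{r_load:>5.0f} ohm load draws {current(v_supply, r_load):.2f} A")
# 18 ohms  -> 1.00 A
# 180 ohms -> 0.10 A
# 100 ohms -> 0.18 A (the ratio assumed in the quoted examples)
```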
 
Hmm. Is there any connection between the voltage and current you get from a power source, like a 4:3 ratio or something like that? Or do most power sources have a custom ratio between voltage and current? And if so, how am I able to find the current that a source outputs if I know the voltage? Connect a load, and use a multimeter and Ohm's law?

Cheers!
Lac.
 
Lac said:
Hmm. Is there any connection between the voltage and current you get from a power source, like a 4:3 ratio or something like that?

No, there's no standard relationship between voltage and current from a power source; it depends entirely on the individual design.

Basically it depends on the mains transformer (assuming the power supply uses one!). The voltage is dependent on the number of turns, and the current on the thickness of the wire.

Or do most power sources have a custom ratio between voltage and current? And if so, how am I able to find the current that a source outputs if I know the voltage? Connect a load, and use a multimeter and Ohm's law?

It may be a problem with your English, but you mention the "current that a source outputs"; this terminology isn't correct. It should either be "current the load requires" or "maximum current the supply can provide".

Measuring the first is simple: just place an ammeter in series with the load and read the current. The second is more difficult to measure; the easiest way is to check the ratings of the components - if the transformer is rated at 1A, then the maximum rating is 1A (near enough). But the lowest-rated component in the PSU is the one that limits the current: a 1A transformer and a 100mA regulator would only provide 100mA maximum.

To actually measure it you would need to apply different loads and monitor the voltage and current, keeping a careful eye on the temperature of the various components - transformer, rectifier, regulator, etc. There's always a chance that you could overload the PSU and damage something.
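A trivial way to express the "lowest-rated component wins" rule from above. The 1A transformer and 100mA regulator figures are the ones in this post; the rectifier rating is only an assumed example.

```python
# The usable output current of a PSU is limited by its lowest-rated part.
# Transformer and regulator ratings are from the post above; the rectifier
# figure is only an assumed example.

ratings = {
    "transformer": 1.0,   # amps
    "rectifier":   1.5,   # amps (assumed example)
    "regulator":   0.1,   # amps (100 mA)
}

limit = min(ratings, key=ratings.get)
print(f"Maximum output: {ratings[limit]*1000:.0f} mA, limited by the {limit}")
# -> Maximum output: 100 mA, limited by the regulator
```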
 
I frequently use 5 volts. Note this is 'pulsed' mode, not continuous: set a mark-space ratio of something like on for 10%, off for 90%, at say 1kHz. The LED has time to recover before the next hit. This gives loads of light but pulls little average current from your source. An advantage of this method is that the colour remains constant.
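A rough sketch of why the average current stays low: average current is peak current times duty cycle. The 10% on-time is the figure above; the 80mA peak pulse current is only an assumed example, not a recommendation.

```python
# Pulsed LED drive: average current = peak current x duty cycle.
# 10% on / 90% off is the mark-space ratio mentioned above;
# the 80 mA peak pulse current is only an assumed example.

i_peak = 0.080      # amps during the 'on' part of each pulse (assumed)
duty = 0.10         # on for 10% of each 1 kHz cycle
i_average = i_peak * duty

print(f"Average current drawn from the supply: {i_average*1000:.0f} mA")  # -> 8 mA
```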

Just another variant...

Steve
 