
Simple LED voltage drop question


bigal_scorpio

Hi Guys,

I was just wondering if there is any advantage to using a diode (or several) to drop the voltage to LEDs, or do you get the same power waste as with a series resistor?

Any thoughts?

BTW I want to drop 5V to 3.5V at 300mA for a 1W white LED.

Thanks for looking.............Al
 
LEDs are driven with a certain amount of current, not a voltage.
The forward voltage drop of an LED is not exact; it is a range of voltages. It might be 3.0V, and if you feed it 3.5V then it will immediately burn out. It could be 4.0V, and if you feed it 3.5V then it won't light, or it will be very dim.
But if you feed it 300mA, it lights perfectly.
 
Adding at least one diode in series could be a benefit if there is a chance that the polarity of the power supply is ever reversed. White LEDs do not tolerate reverse voltage well, and having a diode (Schottky or silicon junction) in series with your LED would protect it in this situation.

Also, adding a series diode to the resistor-white-LED series circuit would reduce the power dissipation in the resistor. But be careful: the forward voltage (VF) of both the LED and the silicon diode is temperature dependent. As temperature increases, VF decreases, so this tends to increase the LED current at higher temperatures. Just make sure to choose the resistor value carefully, considering temperature extremes and the LED's current rating.
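To make that concrete, here is a minimal sketch assuming illustrative values not taken from any datasheet (5V supply, 3.2V LED drop at 300mA, 0.4V Schottky drop) of how the series diode moves dissipation out of the resistor:

```python
# Sketch: how a series diode shifts dissipation out of the resistor.
# All values are illustrative assumptions, not datasheet figures.
V_SUPPLY = 5.0    # volts
V_LED    = 3.2    # volts, assumed LED forward drop at 300mA
V_DIODE  = 0.4    # volts, assumed Schottky forward drop
I_LED    = 0.300  # amps, target LED current

# Resistor alone must drop the whole excess voltage.
r_alone   = (V_SUPPLY - V_LED) / I_LED    # 6.0 ohms
p_r_alone = (V_SUPPLY - V_LED) * I_LED    # 0.54 W in the resistor

# With the diode in series, the resistor drops 0.4V less.
r_with   = (V_SUPPLY - V_LED - V_DIODE) / I_LED    # ~4.67 ohms
p_r_with = (V_SUPPLY - V_LED - V_DIODE) * I_LED    # 0.42 W in the resistor
p_diode  = V_DIODE * I_LED                         # 0.12 W in the diode

print(f"resistor alone: {r_alone:.2f} ohm dissipating {p_r_alone:.2f} W")
print(f"with diode:     {r_with:.2f} ohm dissipating {p_r_with:.2f} W, "
      f"plus {p_diode:.2f} W in the diode")
```

Note that the total wasted power (0.54 W here) is the same either way; the diode only moves some of it out of the resistor.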

Jeff
 
Hi there,


Anything you use to drop the voltage will lose quite a bit of power,
except for a high efficiency switching buck regulator.

Let's say we use a Schottky diode with a 0.4V drop, and let's say we got incredibly lucky and our voltage source is exactly 0.4V higher than the LED drop at its rated current (this will almost NEVER happen in real life, BTW). And let's say that our LED drops exactly 3.6V in this circuit and draws 100mA.
The power getting to the LED is 0.1 times 3.6, which equals 0.36 watts, and the power coming from the battery is 0.1 times 4.0, which equals 0.40 watts. So we are putting out 0.40 watts and only using 0.36 watts, meaning we are wasting 0.40 minus 0.36, or 0.04 watts. That may not seem like much, but it is after all 10 percent of the total power being wasted. It's interesting that if we chose a resistor that dropped exactly 0.4 volts, we would end up with the same power loss; no difference.
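Plugging the numbers from that example into a quick check (the 4.0V source, 0.4V Schottky, and 3.6V/100mA LED are the assumed figures from the paragraph above):

```python
# Quick check of the worked example: 4.0V source, 0.4V Schottky,
# LED dropping exactly 3.6V at 100mA.
I     = 0.100  # amps
V_LED = 3.6    # volts
V_SRC = 4.0    # volts

p_led   = I * V_LED        # 0.36 W delivered to the LED
p_total = I * V_SRC        # 0.40 W drawn from the source
p_waste = p_total - p_led  # 0.04 W lost in the diode

print(f"LED power:   {p_led:.2f} W")
print(f"total power: {p_total:.2f} W")
print(f"wasted:      {p_waste:.2f} W ({p_waste / p_total:.0%} of the total)")
# A resistor dropping the same 0.4V at the same 100mA would waste
# exactly the same 0.04 W.
```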

But that circuit is never going to happen anyway, because the LED cannot be driven from a voltage source through a diode: the diode drop is too 'hard', while a resistor is 'soft', so to speak. The resistor lets the LED run near its rated current and allows a little wiggle room in the LED's voltage drop, while the diode will not allow this.

So, the trick is to pick a resistor that matches the LED and the voltage source. To do this we have to calculate a few things, like the power supply swing and the LED voltage swing, and do a min and max calculation on the LED current. If it works out, we've got the right resistor value; a sketch of that check follows below.
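A minimal sketch of that min/max check, assuming illustrative tolerances (±5% supply, ±5% resistor) and a 3.1V~3.3V forward-voltage range; the candidate resistor value is also an assumption:

```python
# Min/max LED-current check for a candidate series resistor.
# Tolerances and the candidate value are illustrative assumptions.
V_SUP_NOM = 5.0             # volts, nominal supply
SUP_TOL   = 0.05            # +/-5% supply tolerance
VF_MIN, VF_MAX = 3.1, 3.3   # volts, LED forward-voltage range
R_NOM = 8.2                 # ohms, candidate resistor
R_TOL = 0.05                # +/-5% resistor tolerance
I_MAX_ALLOWED = 0.300       # amps, LED current rating

# Worst case for high current: supply high, VF low, resistance low.
i_max = (V_SUP_NOM * (1 + SUP_TOL) - VF_MIN) / (R_NOM * (1 - R_TOL))
# Worst case for low current: supply low, VF high, resistance high.
i_min = (V_SUP_NOM * (1 - SUP_TOL) - VF_MAX) / (R_NOM * (1 + R_TOL))

print(f"LED current range: {i_min * 1000:.0f}mA to {i_max * 1000:.0f}mA")
print("OK" if i_max <= I_MAX_ALLOWED else "too hot: pick a larger resistor")
```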

If you need help choosing the right resistor for your LED, you will have to supply some more info, like what kind of LED it is and what kind of voltage source you will be using with it.
 
Hi Guys,

The spec of the LED I'm using is:
Lens Colour: Water Clear
Emitting Colour: Warm White
Intensity: 50~60lm
Viewing Angle: 120°
Forward Voltage: 3.1V~3.3V
Forward Current: 300mA
Colour Temperature: 3000K

The supply would be 5V from an LM7805 positive regulator.

I will probably be running the LED on PWM at 95% max, but I'm not certain whether I'm going to implement that yet; it depends on how bright the LED is when I test it and whether I will need any dimming.

So any suggestions as to the best and most exact way to run the LED would be welcome.

Al
 
For the best efficiency you would use a PWM controller operating in a regulated current-output (300mA) mode. For that you don't need the 7805 regulator; they make PWM controller chips specifically for this purpose.

If you don't mind wasting about ½ watt of power, then just use a resistor (a pot in series with a fixed resistor if you want dimming) in series with the LED.
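A quick check of that half-watt figure, assuming the LED sits near the middle of its quoted 3.1V~3.3V range:

```python
# Rough check of the "about half a watt" figure for a plain series
# resistor from the 5V rail, assuming VF around 3.2V at 300mA.
V_SUPPLY = 5.0    # volts
V_LED    = 3.2    # volts, assumed midpoint of the 3.1V~3.3V spec
I_LED    = 0.300  # amps

p_wasted = (V_SUPPLY - V_LED) * I_LED
print(f"power burned in the series resistance: {p_wasted:.2f} W")  # ~0.54 W
```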
 
High-power LEDs are supposed to be driven with a current regulator, but you don't have enough headroom voltage for one.

Assuming that 300mA is the absolute maximum allowed current: the 5V regulator might be 5% high at 5.25V, and the LED's forward voltage might be at its low end of 3.1V. Then the current-limiting resistor should be (5.25V - 3.1V)/300mA = 7.17 ohms. A 7.5 ohm resistor will provide 287mA, and if its value is 5% low then its current will be 302mA.
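The same worst-case numbers, worked through in a short sketch:

```python
# Worst-case check from the figures above: 5V regulator up to 5% high,
# LED forward voltage as low as 3.1V, 300mA absolute maximum.
V_SUP_HIGH = 5.0 * 1.05   # 5.25 volts
VF_LOW     = 3.1          # volts
I_ABS_MAX  = 0.300        # amps

r_min = (V_SUP_HIGH - VF_LOW) / I_ABS_MAX
print(f"minimum safe resistance: {r_min:.2f} ohm")  # 7.17 ohm

R = 7.5  # nearest standard value above the minimum
i_nom = (V_SUP_HIGH - VF_LOW) / R           # 287mA nominal
i_low = (V_SUP_HIGH - VF_LOW) / (R * 0.95)  # 302mA if resistor is 5% low
print(f"7.5 ohm gives {i_nom * 1000:.0f}mA nominal, "
      f"{i_low * 1000:.0f}mA if the resistor runs 5% low")
```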
 