How to measure LED Forward Voltage?

Hi all,

I've got some LEDs that I've been advised to measure the forward voltage on, as the spec sheet gives a wide range (1.8-2.4V).

Using an online calculator I worked out that I would need a 560 ohm resistor for a single LED from a 12V source. I entered 20mA as the current.

So my question is: what figures should I expect to see on the multimeter when I measure across the LED and across the resistor?

Thanks,
Ant
 
Think of it this way. Vf, the voltage drop across the LED, is fixed, but you don't know what it is.

Your supply is 12V. The resistor's voltage drop will be 12V - Vf. This makes sense because the sum of the voltage drops across the resistor and the LED must equal the supply voltage.

So when you place your probes on each side of the LED you are directly measuring the Vf of that specific LED.
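
To make that concrete, here's a minimal sketch of the arithmetic (assuming the 12V supply and 560 ohm resistor from the first post; the 9.9V reading is just an illustrative value):

Code:
# Infer the LED's Vf from a measurement across the series resistor.
# Assumed values from this thread: 12V supply, 560 ohm resistor.
V_SUPPLY = 12.0   # supply voltage (V)
R = 560.0         # series resistor (ohms)

def led_vf(v_resistor):
    """The LED drops whatever the resistor doesn't."""
    return V_SUPPLY - v_resistor

def led_current(v_resistor):
    """Ohm's law on the resistor gives the loop current."""
    return v_resistor / R

v_r = 9.9  # example reading across the resistor
print(f"Vf = {led_vf(v_r):.2f} V, I = {led_current(v_r) * 1e3:.1f} mA")
# -> Vf = 2.10 V, I = 17.7 mA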

HTH
 
So should I place the resistor before or after the LED? I've seen schematics with both and always thought it should go before the LED.

Does the resistor take a "set voltage" away from the LED, or just what is left after the LED has had its voltage? Sorry to sound so simple; I'm just trying to get my head round it.

Thanks,
Ant
 
The resistor can go on either side of the LED; it generally doesn't matter.

An LED won't take exactly 1.8V, or 1.984V, or 2.2244821V... it will vary with current, but not by much. Assume the voltage is at the low end, 1.8V. Determine the current the LED expects (most LEDs are specced for 20mA, though it's common to drive them at 10mA to increase their life) or that you want to drive it at, then choose a resistor based on R = (Vsupply - Vled)/I. Then measure the voltage across the LED and you have that LED's typical voltage. It'll vary slightly if you change the resistor or the current.
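
As a quick sketch of that formula (using the 12V supply and low-end 1.8V Vf discussed in this thread):

Code:
# Series resistor sizing: R = (V_supply - V_led) / I
def series_resistor(v_supply, v_led, i_amps):
    return (v_supply - v_led) / i_amps

# 12V supply, 1.8V low-end Vf, 20mA target current:
r = series_resistor(12.0, 1.8, 0.020)
print(f"R = {r:.0f} ohms")  # -> R = 510 ohms; round up to the next standard value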

If you double the voltage across a resistor, you double the current through the resistor. If you double the voltage across an LED that's already on, you'll get a massive current (for a short time).

It is also common to use a current source instead of a voltage source to drive LEDs, then you don't need to worry about the voltage as much.
 
So should I place the resistor before or after the LED? I've seen schematics with both and always thought it should go before the LED.

Does the resistor take a "set voltage" away from the LED, or just what is left after the LED has had its voltage? Sorry to sound so simple; I'm just trying to get my head round it.

Thanks,
Ant
Noggin is correct. You can put the resistor either before or after the LED, but the short lead of the LED (the cathode) must go to ground, either directly or through the resistor.

Ignoring the slight variation Noggin mentioned, the LED will only drop its fixed Vf. The sum of the voltage drops around the circuit must equal the supply, so whatever voltage is left must be handled by the resistor.

With a 12V source and an LED with a Vf of 2V, the resistor will need to drop 10 volts.

If we want 10mA flowing through the LED, we determine what resistance is required using Ohm's law. Give it a shot.
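
(For reference, a sketch of the worked answer, using the 12V and 2V figures above:)

Code:
# 12V supply, 2V Vf, 10mA target current.
v_supply, v_f, i = 12.0, 2.0, 0.010
r = (v_supply - v_f) / i
print(f"R = {r:.0f} ohms")  # -> R = 1000 ohms, a standard 1k resistor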
 
OK, I think I'm getting it. But I don't think I'll see what I think I'll see, i.e. that this LED is 2V, the next one is 1.8V, and so on.

I think I'm trying to be too precise with my calculations, when it's very difficult to be because of the range of values for the LEDs.

I'm using the following in the calcs:
12V supply
2V Vf (per LED) (given range of 1.8-2.4 on spec sheet)
20mA (per LED) (given range of 20-30 on spec sheet)
3 LEDs (in series)

This is giving me the output of:
Calculated limiting resistor: 300 ohms
Nearest higher rated 10% resistor: 330 ohms
Calculated resistor wattage: 0.12 watts
Safe pick is a resistor with power rating of: 0.2 watts

Now, my understanding at the moment is that the 330 ohm resistor will give me less than 20mA through the LEDs, because I actually need a 300 ohm resistor. Is that right?
And that this calculation is only correct IF the LED really has a Vf of 2V. Am I on the right path here?

Thanks, and sorry for all the questions.
Ant
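
A quick numerical check of that reasoning (a sketch using the 3-LED figures above):

Code:
# 3 LEDs in series at a nominal 2V Vf each, 12V supply, 20mA target.
v_supply, v_f, n_leds, i_target = 12.0, 2.0, 3, 0.020

r_exact = (v_supply - n_leds * v_f) / i_target
print(f"exact resistor: {r_exact:.0f} ohms")  # -> 300 ohms

# The next standard value up (330 ohms) drops the current a little:
i_actual = (v_supply - n_leds * v_f) / 330.0
print(f"current with 330 ohms: {i_actual * 1e3:.1f} mA")  # -> 18.2 mA

So yes: 330 ohms gives a bit under 20mA, and only if the LEDs really are 2V each.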
 
With a 12V supply and LEDs that have a max forward voltage of 2.4V, four LEDs can be connected in series. Then if the LEDs are all actually 2.0V, the current-limiting resistor is (12V - 8V)/20mA = 200 ohms.
I have not seen a 10% resistor in 50 years. 200 ohms and 300 ohms are common 5% resistor values.
 
Just copying what it said on the screen.

I think what I'm struggling with at the moment is a way of proving what the actual Vf of the LEDs is. Maybe I'm getting hung up on this when I needn't be.
 
The reason for measuring the forward voltage and then sorting the LEDs is that you want each string to have the same current so their brightnesses are the same.

With a 12.0V supply:
Four 1.8V LEDs in series is 7.2V. The resistor value for 20mA is 240 ohms.
If the four LEDs are all 2.4V then their total voltage is 9.6V and the 240 ohm resistor will give a current of only 10mA and they will look dim.

Maybe you should mix the LED voltages in each string so the total voltage is the same for each string; then the resistor value can be the same for each string.
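
To put numbers on that spread, a small sketch (assuming the 12.0V supply and four-LED strings above):

Code:
# Current through a string of four LEDs with a fixed 240 ohm resistor,
# at the two spec-sheet extremes of Vf.
v_supply, r, n_leds = 12.0, 240.0, 4

for v_f in (1.8, 2.4):
    i = (v_supply - n_leds * v_f) / r
    print(f"Vf = {v_f} V per LED -> {i * 1e3:.0f} mA")
# -> Vf = 1.8 V per LED -> 20 mA
# -> Vf = 2.4 V per LED -> 10 mA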
 
The reason for measuring the forward voltage and then sorting the LEDs is that you want each string to have the same current so their brightnesses are the same.

With a 12.0V supply:
Four 1.8V LEDs in series is 7.2V. The resistor value for 20mA is 240 ohms.
If the four LEDs are all 2.4V then their total voltage is 9.6V and the 240 ohm resistor will give a current of only 10mA and they will look dim.

I think that's what my worry is about. I can't see them being that far apart, though, personally. I think I just got the willies put up me by my other topic. I will be having a play around with these tonight and hopefully have a better understanding of them by the time I've finished.
 
OK, so I hooked one LED up with a 560 ohm resistor and measured a forward voltage of 2.1V.

I tried hooking another up with a 1k ohm resistor, expecting to see a big drop in light output, but there was very little noticeable difference. Bit confused at that... could someone explain what sort of resistance I would expect to need to see a real dip in output from the LED?

Many thanks,
Ant
 
I have noticed that some LEDs seem to look the same brightness over a pretty wide spread of current.

The best way to compare them is to run several side by side at various currents.

Pick the lowest current that is bright enough for your needs. It will be easier on the LED and it will last longer.

How many volts are you using to test them?
 
OK, so I hooked one LED up with a 560 ohm resistor and measured a forward voltage of 2.1V.

I tried hooking another up with a 1k ohm resistor, expecting to see a big drop in light output, but there was very little noticeable difference. Bit confused at that... could someone explain what sort of resistance I would expect to need to see a real dip in output from the LED?

Many thanks,
Ant

In fact, when you used 1k the LED had about half the current and gave about half the light output.

The problem is that your eyes are not a linear measuring device; they "see" :D on a logarithmic scale.

It is rather difficult to see a 2-to-1 difference in light (in photo terms, one f-stop).

It's easier to compare two LEDs at different currents if they are lit at the same time and placed side by side.
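
For the record, a sketch of the currents in those two tests (assuming the 12V supply and the 2.1V Vf measured above):

Code:
# 1k only roughly halves the current relative to 560 ohms, and the eye's
# roughly logarithmic response makes a 2:1 change hard to spot.
v_supply, v_f = 12.0, 2.1

for r in (560.0, 1000.0):
    i = (v_supply - v_f) / r
    print(f"{r:.0f} ohms -> {i * 1e3:.1f} mA")
# -> 560 ohms -> 17.7 mA
# -> 1000 ohms -> 9.9 mA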

EDIT: Andy beat me to the answer:
The best way to compare them is to run several side by side at various currents.
 
LEDs tend to be more efficient at lower currents, and the logarithmic response of the eye means the power has to decrease by a factor of about four to look half as bright.

Don't use LEDs at the absolute maximum current rating.

Another trick is to size the resistor assuming the lowest LED voltage drop and the highest supply voltage; then you'll never overdrive the LED.
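
A sketch of that worst-case sizing; the 14.4V figure is an assumption (a car's "12V" system can rise to roughly that while the alternator is charging):

Code:
# Size for the worst case: lowest spec-sheet Vf, highest expected supply.
v_supply_max = 14.4   # assumed maximum for an automotive "12V" rail
v_f_min = 1.8         # lowest spec-sheet Vf
n_leds = 3
i_max = 0.020         # 20mA ceiling

r_min = (v_supply_max - n_leds * v_f_min) / i_max
print(f"R >= {r_min:.0f} ohms")  # -> 450 ohms; pick the next standard value up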
 
I'm going to buy a few different resistor packs and see what does what to give me the required light output.

I had them both running at the same time on the same piece of stripboard (separate circuits), so it was clear to see the little (if any) difference between them.

Glad that I've made some headway though. I will continue to have a play around with them and various resistors.

Oh, and I'm using a 12V battery charger for the power supply while I'm working at the bench.

For those that don't know, these are going to be used for a set of rear lights on a car. So the low output doesn't really need to be that great compared to the full (20mA) output when the brake light circuit is live.
 
Police give tickets to cars with customized lighting when it is too dim.
 
I'm not talking dim as in hardly visible. All the lights will be tested against the existing lights to ensure they are as close to, or better than, the existing ones. I'm not doing this to be illegal or dangerous.

And I'll be most annoyed if I get pulled over for my lights being different when there are so many buckets of sh!t on the British roads.
 
In Canada it is illegal to modify the exterior lighting on a car.
But many stores sell dim Chinese LED "lighting upgrades".
 