
Do resistors consume/waste electricity while in series?


joecool85

Member
I was always under the presumption that if you had a circuit for an LED and a series resistor to drop voltage, you would be wasting power by heating the resistor. That said, I was looking for an LED that would consume less than 5ma for a project of mine and was going to buy some 1ma LEDs when someone told me about just using a bright LED and a larger resistor. I did some testing and this is what I found.

8.65v source voltage at battery. Red LED with 268 ohm resistor in series, 2.21v at LED and 22.6ma draw on circuit (measured between the battery and the resistor). Same Red LED but with 1,256 ohms of resistance in series, only 1.93v at LED and only drawing 5.2ma in the circuit. I could have gone a little higher on the resistor because the LED was still bright enough for my purposes.

So, do resistors consume or "waste" electricity when in series, or only when in parallel from V+ to ground (as in a voltage divider)?
 
The power wasted by a resistor is the voltage across it times the current through it.
The wasted power can also be calculated by the voltage across it squared divided by the resistor value, or by the current in it squared times the resistor value.
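To make that concrete, here is a quick sketch in Python that plugs in the measurements from the first post (8.65v battery, 268 ohm resistor, 2.21v across the LED) and shows all three formulas giving the same answer:

```python
# Power dissipated in a series resistor, three equivalent ways.
# Values taken from the measurements earlier in the thread.
V_SUPPLY = 8.65   # battery voltage (V)
V_LED = 2.21      # voltage measured across the LED (V)
R = 268.0         # series resistance (ohms)

v_r = V_SUPPLY - V_LED          # voltage across the resistor
i = v_r / R                     # current through the series circuit (A)

p1 = v_r * i                    # P = V * I
p2 = v_r ** 2 / R               # P = V^2 / R
p3 = i ** 2 * R                 # P = I^2 * R

print(f"I = {i*1000:.1f} mA")                                          # ~24.0 mA
print(f"P = {p1*1000:.0f} mW = {p2*1000:.0f} mW = {p3*1000:.0f} mW")   # ~155 mW each way
```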
 

What do you mean by the "voltage across it"? I measured with my multimeter expecting that, since the battery was 8.65v and the LED was receiving 2.21v, I would see 6.44v, but I didn't. I saw 0v. Maybe I had a bad connection.

If it was 6.44v, that would mean according to your math that the resistor is using (6.44v x6.44v)/268 ohm = ~0.15 ma

Is that right? I suppose I could verify that by checking current flow between the resistor and the LED...
 
You are mixing up current and power.
The current in the resistor is 6.44V/268 ohms= 0.024A which is 24mA. The LED will be bright.
The power in the resistor is 6.44V squared/268 ohms= 0.155 Watts which is 155mW. A 1/4W resistor will be very warm.
 
Does this mean that since the LED was operating at 2.21v it was consuming ~50mw, while the resistor was dropping 6.44v @ 22.6ma, meaning ~146mw, for a circuit total of ~196mw?

And if that is the case, then when I added the other 1k resistor the LED was consuming 1.93v @ 5.2ma, or 10mw, while the resistors were dropping the rest of the voltage, 6.72v, also at 5.2ma, meaning ~35mw, for a grand circuit total of 45mw? If so, it is indeed true that the added resistor does cut down the overall power needed from the battery, but the resistor is chewing up more than the LED.
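A quick check of that arithmetic, using the voltages and currents measured earlier in the thread:

```python
# Power budget for each element using P = V * I, with the measured values above.
def budget(v_led, v_res, i):
    p_led, p_res = v_led * i, v_res * i
    print(f"LED {p_led*1000:.0f} mW + resistor {p_res*1000:.0f} mW "
          f"= {(p_led + p_res)*1000:.0f} mW total")

budget(2.21, 6.44, 0.0226)   # 268 ohm case:  LED ~50 mW, resistor ~146 mW
budget(1.93, 6.72, 0.0052)   # 1256 ohm case: LED ~10 mW, resistor ~35 mW
```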

So, the grand question: is it a more efficient use of power to have an ultra bright LED running at 1ma via a high resistance resistor, or to use a 1ma LED with a "regular" sized resistor? I'm not sure what values each would take since I don't have them here to try myself.
 
Running an LED at 1mA from a given voltage requires a certain value of resistance. Any change in resistance from that value will change the current (and consequently the power). So your question about a "regular" or "high resistance" resistor for operating an LED at 1ma makes no sense.

The only ways to reduce the resistor power for a given LED current are:

1) lower the supply voltage

2) use a high efficiency switching current regulator to drive the LED (which requires no resistor)
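As a rough illustration of option 1, here is a sketch of how the resistor's wasted power shrinks as the supply gets closer to the LED's forward voltage (the 2.0V drop and the supply values are just assumptions for the example):

```python
# Resistor value and wasted power for a fixed 1 mA LED current
# at a few supply voltages (2.0 V forward drop is just an assumption).
I_LED = 0.001   # 1 mA target
V_F = 2.0       # assumed LED forward voltage

for v_supply in (9.0, 5.0, 3.0):
    v_r = v_supply - V_F
    print(f"{v_supply} V supply: R = {v_r / I_LED:.0f} ohms, "
          f"resistor wastes {v_r * I_LED * 1000:.1f} mW")
```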
 
An LED that is "super bright" is usually just an ordinary LED in a case that focusses all the light into a narrow beam. If it doesn't shine directly in your direction then it might not be seen.
 
There is a class of "Super Bright" LEDs that are indeed more efficient and put out more total light at a given operating current than the standard variety.
 

Sorry, that was muddled. What I mean is: what is the difference between using an "ultra bright" LED that is rated at 20ma with a resistor that brings it down to a 1ma draw, compared to using a "regular" LED that is rated at 1ma with a resistor sized to keep it at 1ma?
 
Actually, the "efficiency" of an LED is measured in millicandelas per milliwatt, or for power LEDs in lumens per watt. Many ultra-bright LEDs (13,000mcd @ 30mA with a 1.5V drop) are indeed high efficiency LEDs. A 13,000mcd LED running 30mA at 1.5V would rate at 289 mcd/mW and would be brighter than an LED rated at 100mcd @ 10mA (again at 1.5V, which would yield 6.67 mcd/mW) at any equivalent power. So if each were run at 1mW of power, the first would be emitting 289mcd, the other 6.67mcd.
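Working those two examples out explicitly (a small sketch using the ratings quoted above):

```python
# Luminous efficiency (mcd per mW of electrical power) for the two LEDs above.
def mcd_per_mw(mcd, amps, vf):
    return mcd / (amps * vf * 1000)   # electrical power converted to mW

print(f"ultra-bright: {mcd_per_mw(13000, 0.030, 1.5):.0f} mcd/mW")  # ~289
print(f"standard:     {mcd_per_mw(100, 0.010, 1.5):.1f} mcd/mW")    # ~6.7
```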
 
If they are both operating at 1mA then the total power drawn from the power supply would be the same, however the brightness could be significantly different, as noted by unclejed613.
 
To further add onto what unclejed613 said.

LEDs have different efficiencies from model to model, to be sure. They also have different efficiency when operated at different currents, on a part-by-part basis. And that efficiency is not necessarily linear. For example, if you had a 10ma LED, you may get something like...

10mA x 1.5v = 15mw = 100mcd (or 6.6mcd/mw)

However, if you cut the current in half you might get something like...

5mA x 1.5v = 7.5mw = 40mcd (5.3mcd/mw)

So, even though the light output and current consumption are lower and you're using the exact same LED... the efficiency is actually a bit lower, because the light output drops to less than half rather than the half you might have expected. This might explain the data the OP mentions in post #3.

The labeled mA rating for any given LED is *USUALLY* the point of highest efficiency. But it can also be the part's maximum allowable current before destruction. And it can even be both. It all depends on the LED. Read the datasheet; there should be a graph displaying current vs light output. In any case, I would say that changing the value of the series resistor shouldn't change the efficiency of the whole circuit just due to the change in power dissipated in the resistor alone. It should change the efficiency only because the LED's efficiency changes with a change in current. That doesn't mean changing the resistor doesn't change the power consumed, just that the light produced should more or less track the power consumed (in accordance with the LED's efficiency, of course).
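Using the made-up numbers from the example above, the efficiency change looks like this (the mcd figures are hypothetical, as noted):

```python
# Hypothetical LED from the example above: efficiency at two operating points.
for amps, vf, mcd in [(0.010, 1.5, 100), (0.005, 1.5, 40)]:
    p_mw = amps * vf * 1000
    print(f"{amps*1000:.0f} mA: {p_mw:.1f} mW in, {mcd} mcd out ({mcd/p_mw:.1f} mcd/mW)")
```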
 
For most LEDs the current vs luminosity curve is almost linear, but as the junction temperature goes up with current, the efficiency drops a lot. Most LED datasheets show two curves, one of luminosity vs current, and one of luminosity vs junction temperature. If the LED is not heatsinked in some way, the junction temperature will climb with current, causing the top end of the curve to be very nonlinear. At the currents you are looking to drive the LED at (1mA), I don't think the thermal effects will matter much. If you were to drive the LED near its maximum current, then you would definitely want to look at some way of getting rid of the heat.

Going back to the original question: if you have an LED in series with a resistor across a 9V battery, the resistor dissipates heat in much larger proportion than the LED does, because it has a voltage drop of 7.5V across it while the LED drops 1.5V, at the same current. For a 1mA current, the resistor would be 7500 ohms (7.5V/1mA). The resistor actually should be a little higher in value, because the LED forward voltage vs current curve is VERY nonlinear, so the actual voltage drop across the LED might only be 1.2V. Another way to approach this problem might be to construct a transistor constant current source and set it to 1mA. This would provide 1mA no matter what the battery voltage is (at least until the battery voltage drops too low to keep the transistor properly biased).
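A small sketch of that resistor calculation, trying both forward-drop guesses from the paragraph above (1.5V nominal and the more realistic 1.2V at low current):

```python
# Series resistor for a 1 mA LED current from a 9 V battery,
# for two guesses at the LED forward drop.
def series_resistor(v_supply, v_f, i_target):
    v_r = v_supply - v_f
    return v_r / i_target, v_r * i_target   # (ohms, resistor power in W)

for v_f in (1.5, 1.2):
    r, p_r = series_resistor(9.0, v_f, 0.001)
    print(f"Vf = {v_f} V: R = {r:.0f} ohms, resistor dissipates {p_r*1000:.1f} mW")
# -> 7500 ohms / 7.5 mW at a 1.5 V drop, 7800 ohms / 7.8 mW at a 1.2 V drop
```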
 
"for most LEDs the current vs luminousity curve is almost linear"

Yeah... I may have actually been thinking of the forward voltage drop vs current curve. I just misspoke. I'll leave it as is; people will just have to read this far to get the correct information.

Anyway, as for the OP's question. What I *THINK* he's asking is...

"Does JUST the value of the resistor have an effect on the total efficiency of the circuits ability to convert electricity to light, if all other factors remaining unchanged?"

And as far as I know (assuming we can't change anything else about the circuit), the resistor's value itself does not affect the efficiency directly, only through the cause and effect it has on the circuit. Keep in mind that we are fully aware that dropping the resistance has the effect of increasing the current, and thus the power draw and light output. Also knowing full well that in the circuit discussed, the resistor has the biggest portion of the power being dissipated.

In other words, changing the resistance only affects efficiency because it affects the current through the total circuit, if it affects the efficiency at all.

Case in point: you can increase the efficiency of the circuit by starting with a supply voltage closer to the voltage drop of the LED; then you would be using a smaller resistor for a given current/light output. This works because the proportions of power loss are shifted more onto the LED and less onto the resistor. But changing the resistor alone will not have this effect, because it changes the current through itself as well as the current through the LED. So the power dissipation RATIO would remain the same no matter what you do to the resistor. You have to change the supply voltage or the LED along with the resistor to otherwise unbalance the equation. Or figure out a way around that pesky first law of thermodynamics.
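To put numbers on that ratio argument, here is a sketch assuming a fixed 2.0V forward drop (an assumption; real LEDs vary a little with current):

```python
# Fraction of the total power that ends up in the LED (the rest heats the resistor).
# With a fixed forward drop, this ratio depends only on the supply voltage,
# not on the resistor value -- changing R changes the current, not the split.
V_F = 2.0   # assumed constant LED forward voltage

for v_supply in (9.0, 5.0, 3.0):
    print(f"{v_supply} V supply: {V_F / v_supply:.0%} of the power reaches the LED")
```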

*HOWEVER*, based on the LED forward voltage drop vs current curve, I would say yes, it can actually have an effect on efficiency under certain scenarios.
 
Wow, you guys are quite helpful. So, for this specific use it is an indicator LED and merely needs to light up (even a little is plenty, really), so as long as I can make even a generic LED light up enough at 1ma it will be the same (for my purposes) as an LED that is rated for 1ma.

Also, I'm glad that I was right in my head that the resistor is wasting power, so if you are, for instance, powering a string of 6 LEDs that need 1.6v each on a 12v power source, it is best to put them in series and have a resistor drop the remaining 2.4v, as opposed to having them in parallel and having a resistor drop 10.4v.
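Putting rough numbers on that series-versus-parallel comparison (20ma per LED is an assumed operating current, not from the thread):

```python
# Six 1.6 V LEDs on a 12 V supply, run at an assumed 20 mA each.
v_supply, v_f, n, i = 12.0, 1.6, 6, 0.020

# Series string: one resistor drops the leftover 2.4 V, one shared current.
p_series_res = (v_supply - n * v_f) * i
p_series_total = v_supply * i

# Parallel: each LED's resistor drops 10.4 V, and the battery supplies 6x the current.
p_par_res = (v_supply - v_f) * i * n
p_par_total = v_supply * i * n

print(f"series:   {p_series_res*1000:.0f} mW wasted of {p_series_total*1000:.0f} mW total")
print(f"parallel: {p_par_res*1000:.0f} mW wasted of {p_par_total*1000:.0f} mW total")
```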
 
What is it that you're trying to accomplish? Just one light for an indicator, or a string for lighting or some kind of decoration? Odds are someone here can help or make some suggestions on how to.
 
A few years back I used to make and sell LED flashlights that used a white LED mounted on a 9V battery clip with a resistor and a reverse-voltage protection diode. The components and battery clip were sealed in hot glue. The whole thing, including the battery, fit in a plastic 35mm film can. The device was made for an emergency survival kit. Battery life was a few days of continuous use. Current was 5mA. Using the brightest CREE white LEDs available at the time, it was bright enough to read a map with, or to see one's way through a dark house.
 

It is the "on" indicator on a guitar effects pedal. I don't want it wasting more power than need be because I run it off 9v batteries and don't want to add a power jack to my design. The effect only draws 0.6ma on it's own but the current LED draws over 20ma. I want this much lower, 1-2ma would suffice and give me ridiculous battery life :)
 
Try connecting a bright LED in series between the 9V battery and the effect circuit; then there will be no power wasted. The effect will probably work fine with the small voltage dropped by a 1.8V red LED. An older green LED would drop about 2.2V.
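Roughly what that looks like with the numbers from this thread (assuming a 1.8V red LED and the 0.6ma effect current mentioned above):

```python
# Indicator LED wired in series with the effect's supply: it reuses the
# effect's own current instead of adding a separate resistor branch.
v_batt, v_led, i_effect = 9.0, 1.8, 0.0006   # volts, volts, amps (0.6 mA)

print(f"{v_batt - v_led:.1f} V left for the effect")         # 7.2 V
print(f"LED dissipates {v_led * i_effect * 1000:.2f} mW")     # ~1.08 mW, no extra current drawn
```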
 