I was always under the impression that if you had a circuit with an LED and a series resistor to drop the voltage, you would be wasting power by heating the resistor. With that in mind, I was looking for an LED that would consume less than 5 mA for a project of mine, and I was about to buy some 1 mA LEDs when someone told me to just use a bright LED with a larger resistor. I did some testing, and this is what I found.
Source voltage at the battery: 8.65 V. With a red LED and a 268 Ω resistor in series, I measured 2.21 V across the LED and 22.6 mA of current (measured between the battery and the resistor). With the same red LED but 1,256 Ω of series resistance, I measured only 1.93 V across the LED and only 5.2 mA of current. I could have gone a little higher on the resistor, because the LED was still bright enough for my purposes.
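A quick sanity check of these measurements, sketched in Python (the helper function and variable names are mine, not from any library). It applies Ohm's law and P = I²R / P = VI to both test cases:

```python
# Power budget for an LED + series resistor circuit.
# P_resistor = I^2 * R, P_led = V_led * I, P_total = V_source * I.

def circuit_power(v_source, v_led, r, i_measured):
    """Return (resistor power, LED power, total power) in watts."""
    p_resistor = i_measured ** 2 * r     # heat dissipated in the resistor
    p_led = v_led * i_measured           # power delivered to the LED
    p_total = v_source * i_measured      # total drawn from the battery
    return p_resistor, p_led, p_total

# Case 1: 268 ohm resistor, 22.6 mA measured
p_r1, p_led1, p_tot1 = circuit_power(8.65, 2.21, 268, 0.0226)
# Case 2: 1256 ohm resistor, 5.2 mA measured
p_r2, p_led2, p_tot2 = circuit_power(8.65, 1.93, 1256, 0.0052)

print(f"268 ohm:  resistor {p_r1*1000:.0f} mW, total {p_tot1*1000:.0f} mW")
print(f"1256 ohm: resistor {p_r2*1000:.0f} mW, total {p_tot2*1000:.0f} mW")
```

The larger resistor does dissipate a bigger share of the total, but because the current drops so much, both the resistor's heat (about 137 mW vs. 34 mW) and the total battery drain (about 195 mW vs. 45 mW) are far lower in the high-resistance case.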
So, do resistors consume or "waste" power when placed in series with a load, or only when connected in parallel from V+ to ground (as in a voltage divider)?