Someone Electro said:
The equation is:
(BatteryVoltage - LEDVoltage) / LEDCurrent = Resistance
It is that easy to calculate. For your 9V battery and a single 3.5V green LED, the resistor will have 5.5V across it (the subtraction part of the equation). Ohm's Law says that for 20mA to flow in the resistor (and also in the LED, because they are in series), the resistor value is (voltage divided by current) 275 ohms. Use 270 ohms, which is a very close stock value, and the current will be 20.4mA.
The power in the 270 ohm resistor is calculated as the voltage across it squared (multiplied by itself) divided by the resistance, so 5.5 squared is 30.25, divided by 270 is 112mW. A 1/4W resistor will be slightly warm.
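The arithmetic above can be sketched in a few lines of Python (the voltages, current target, and 270 ohm stock value are the ones from this example, not universal constants):

```python
# Series-resistor calculation for an LED, using the values above:
# 9 V battery, 3.5 V LED forward drop, 20 mA target current.
V_BATTERY = 9.0   # battery voltage (V)
V_LED = 3.5       # LED forward voltage (V)
I_LED = 0.020     # desired LED current (A)

# The resistor sees whatever voltage is left over after the LED.
v_resistor = V_BATTERY - V_LED            # 5.5 V
r_exact = v_resistor / I_LED              # 275 ohms by Ohm's Law
r_stock = 270.0                           # nearest standard value
i_actual = v_resistor / r_stock           # ~20.4 mA with the stock part
p_resistor = v_resistor ** 2 / r_stock    # ~112 mW dissipated in the resistor

print(f"exact resistor:        {r_exact:.0f} ohm")
print(f"current with 270 ohm:  {i_actual * 1000:.1f} mA")
print(f"resistor power:        {p_resistor * 1000:.0f} mW")
```

The 112mW result is why a 1/4W (250mW) resistor is comfortable here with margin to spare.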
When the battery voltage drops down to 6V, the current is calculated to be 9.3mA, but it will actually be a little more because the LED's curves show that it will have a voltage drop of 3.35V at 10mA.
So (6.0 - 3.35) / 270 = 9.8mA. It will look almost as bright as at 20mA, because the eye's response is logarithmic.
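The run-down-battery case is the same calculation with new numbers; the 3.35V figure is the forward drop read off the LED's datasheet curve at around 10mA:

```python
# Same series-resistor calculation with the battery run down to 6 V.
# The LED's forward drop falls to ~3.35 V at ~10 mA (datasheet curve).
V_BATTERY_LOW = 6.0   # run-down battery voltage (V)
V_LED_LOW = 3.35      # LED forward drop at the lower current (V)
R = 270.0             # same series resistor (ohms)

i_low = (V_BATTERY_LOW - V_LED_LOW) / R   # ~9.8 mA
print(f"current at 6 V: {i_low * 1000:.1f} mA")
```

Note the current and the forward voltage depend on each other, so strictly this is one step of an iteration; for a quick estimate, one pass with the datasheet value is plenty.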
A 9V battery doesn't have much power capacity, and when it drives a 3.5V LED most of its power is wasted in the current-limiting resistor. That's why small commercial LED flashlights use two AA cells (4 to 5 times the power capacity of a 9V battery) and a voltage step-up IC. The IC has regulation to keep the LED current the same, as the battery voltage drops.
A little key-ring LED flashlight doesn't even have a circuit. Just an LED (3.5V white one) and 3 tiny battery cells. The internal resistance of those tiny cells drops their 4.5V down to about 3.5V at about 20mA. Those tiny cells don't last long.
Think again about using only green. If you shine a green LED on a red stop sign in the dark, you will see nuthin', because the red paint reflects almost no green light.