Hi there,
Anything you use to drop the voltage will lose quite a bit of power,
except for a high-efficiency switching buck regulator.
Let's say we use a Schottky diode with a 0.4 V drop, and let's say
that we got incredibly lucky and our voltage source was exactly
0.4 V higher than the LED drop at its rated current (this will almost
NEVER happen in real life, BTW). And let's say that our LED drops
exactly 3.6 V in this circuit and draws 100 mA.
The power getting to the LED is 0.1 times 3.6, which equals 0.36 watts,
and the power coming from the battery is 0.1 times 4.0, which equals
0.40 watts. So the battery is putting out 0.40 watts while the LED is
only using 0.36 watts, which means we are wasting 0.40 minus 0.36 =
0.04 watts. That may not seem like much, but it is 10 percent of the
total power being wasted. Interestingly, if we chose a resistor that
dropped exactly 0.4 volts, we would end up with the same power loss;
no difference.
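If you want to play with the numbers yourself, here is a quick back-of-the-envelope sketch in plain Python. The values are just the example figures from this post (an assumed 4.0 V source and a 3.6 V LED at 100 mA), not measurements from any real part:

```python
# Power bookkeeping for the idealized example above.
i_led = 0.100              # LED current in amps (100 mA)
v_led = 3.6                # LED forward drop in volts
v_drop = 0.4               # drop across the Schottky diode (or resistor)
v_supply = v_led + v_drop  # 4.0 V source in this idealized example

p_led = i_led * v_led          # 0.36 W delivered to the LED
p_supply = i_led * v_supply    # 0.40 W drawn from the source
p_wasted = p_supply - p_led    # 0.04 W lost in the dropping element

print(f"LED power:    {p_led:.2f} W")
print(f"Source power: {p_supply:.2f} W")
print(f"Wasted:       {p_wasted:.2f} W ({p_wasted / p_supply:.0%} of total)")
```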
But that circuit is never going to happen anyway, because the LED
cannot be driven from a voltage source through a diode: the diode's
drop is too 'hard', while a resistor's is 'soft', so to speak.
The resistor lets the LED run near its rated current and allows
for a little wiggle room in the LED's voltage drop, while the diode
will not allow this.
So, the trick is to pick a resistor that matches the LED and the
voltage source. To do this we have to calculate a few things,
like the power supply swing and the LED's forward-voltage swing,
and do a min and max calculation on the current through the LED.
If it works out, we've got the right resistor value.
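Just to give a feel for what that min/max check looks like, here is a rough sketch. Every number in it is a placeholder assumption (a 5 V supply with roughly 5 percent swing, an LED with a 3.4 to 3.8 V forward-voltage range rated at 100 mA); swap in your own supply tolerance and datasheet figures:

```python
# Resistor-selection sketch with placeholder numbers; replace them with
# your actual supply tolerance and the LED's datasheet Vf range.
v_sup_nom, v_sup_min, v_sup_max = 5.00, 4.75, 5.25   # supply swing (volts)
v_led_nom, v_led_min, v_led_max = 3.6, 3.4, 3.8      # LED Vf swing (volts)
i_rated = 0.100                                      # rated LED current (amps)

# Pick R for the nominal case: R = (Vsupply - Vled) / I
r = (v_sup_nom - v_led_nom) / i_rated
print(f"Nominal resistor: {r:.1f} ohms")

# Worst cases: highest supply with lowest Vf, and lowest supply with highest Vf
i_max = (v_sup_max - v_led_min) / r
i_min = (v_sup_min - v_led_max) / r
print(f"LED current range: {i_min * 1e3:.0f} mA to {i_max * 1e3:.0f} mA")
```

If the max current stays under the LED's absolute maximum rating and the min current still gives you enough brightness, that resistor value works; otherwise nudge the resistance up or use a supply with less swing.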
If you need more information on choosing the right resistor for
your LED, you will have to supply some more info like what
kind of LED it is, and what kind of voltage source you will be
using with it.