"for most LEDs the current vs luminousity curve is almost linear"
Yeah... I may actually have been thinking of the forward voltage drop vs current curve. I just misspoke. I'll leave it as is; people will just have to read this far to get the correct information.
Anyway, as for the OP's question, what I *THINK* he's asking is...
"Does
JUST the value of the resistor have an effect on the total efficiency of the circuits ability to convert electricity to light, if all other factors remaining unchanged?"
And as far as I know (assuming we can't change anything else about the circuit), the resistor's value itself does not affect the efficiency directly, only through the cause and effect it has on the rest of the circuit. Keep in mind that we are fully aware that dropping the resistance increases the current, and thus the power draw and light output. We also know full well that in the circuit discussed, the resistor dissipates the biggest portion of the power.
In other words, changing the resistance only affects efficiency because it affects the current through the whole circuit, if it affects the efficiency at all.
Case in point: you can increase the efficiency of the circuit by starting with a supply voltage closer to the voltage drop of the LED, since you would then be using a smaller resistor for a given current/light output. This works because the proportions of power loss are shifted more onto the LED and less onto the resistor. But changing the resistor alone will not have this effect, because it changes the current through itself as well as the current through the LED. So the power dissipation RATIO would remain the same no matter what you do to the resistor. You have to change the supply voltage or the LED along with the resistor to otherwise unbalance the equation, or figure out a way around that pesky first law of thermodynamics.
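To put some quick numbers on that, here's a minimal sketch (Python, assuming an idealized LED with a fixed 2.0 V forward drop; the supply and resistor values are made up just to show the ratio):

```python
# Idealized series circuit: supply -> resistor -> LED, fixed Vf assumed
def split(v_supply, v_led, r):
    i = (v_supply - v_led) / r            # series current
    p_led = v_led * i                     # power delivered to the LED
    p_res = (v_supply - v_led) * i        # power burned in the resistor
    return i, p_led / (p_led + p_res)     # current, LED's share of total power

# Same 5 V supply, two different resistors: the current halves,
# but the LED's share of the power stays at Vf/Vsupply = 40 %.
print(split(5.0, 2.0, 150))   # -> (0.02, 0.4)
print(split(5.0, 2.0, 300))   # -> (0.01, 0.4)

# Drop the supply to 3 V and re-size the resistor for the same 20 mA:
# the LED's share rises to about 67 %, i.e. the circuit got more efficient.
print(split(3.0, 2.0, 50))    # -> (0.02, ~0.667)
```

In a plain series circuit the LED's share of the power is just Vf/Vsupply, so it only moves when the supply or the LED changes, not when the resistor does.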
*HOWEVER*, based on the LED's forward voltage drop vs current curve, I would say yes, it can actually have an effect on efficiency under certain scenarios, since changing the current also moves the LED's forward voltage, and with it the LED's share of the total power.
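And a rough sketch of why (using a made-up, roughly linear Vf(I) relation just to show the direction of the effect, not real LED data):

```python
# Toy model: Vf creeps up a little with current (shape is illustrative only)
def vf(i):
    return 1.9 + 5.0 * i      # e.g. 2.0 V at 20 mA, ~1.95 V at 10 mA

def efficiency(v_supply, r):
    # Solve v_supply = vf(i) + i*r for i, using the linear toy Vf above
    i = (v_supply - 1.9) / (r + 5.0)
    return vf(i) / v_supply   # LED's share of total power = Vf / Vsupply

print(efficiency(5.0, 150))   # -> ~0.400 at ~20 mA
print(efficiency(5.0, 300))   # -> ~0.390 at ~10 mA: the ratio shifts a bit
```

So a bigger resistor drops the current, which drops Vf a little, which nudges the power split slightly away from the LED.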