Hello there Nigel and thanks for the reply.
I mentioned the calculation of the series resistor for the diode because you really cannot just throw any old resistor value in there; the effect can range from not enough compensation to far too much. An example follows.
We have one diode D1 dropping 0.8v and the other D2 dropping 0.9v with D1 current of 2 amps and D2 current of 1 amp. The formula in this case is:
R=(V2-V1)/(I1-I2)
and plugging in the values we get:
R=0.1 Ohms
which matches one of your suggestions.
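For anyone who wants to plug in their own diode measurements, the formula is a one-liner. A quick sketch in Python (the function name is my own):

```python
def balance_resistor(v1, i1, v2, i2):
    """Rough sizing rule for the equalizing series resistor on two
    paralleled diodes: R = (V2 - V1) / (I1 - I2)."""
    return (v2 - v1) / (i1 - i2)

# D1 drops 0.8 V at 2 A, D2 drops 0.9 V at 1 A
print(balance_resistor(0.8, 2.0, 0.9, 1.0))  # ~0.1 Ohms
```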
Now right off we can see that if you used a 0.22 Ohm resistor, we would be dissipating more power than need be, but that's not the worst of it.
Next we have D1 dropping 0.8v again and D2 dropping 0.9v, but this time D1 current is 2 amps and D2 current is 1.9 amps. Plugging these values in we get:
R=1 Ohm
and that is what it takes to equalize the two currents.
We might want to go through another iteration of these calculations, but you can already see the difference: if we chose 0.22 Ohms for the first case we would be dissipating too much power, and if we chose 0.22 Ohms for the second case we would not be compensating enough.
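To make the comparison concrete, here is the same formula run on both cases (a sketch; the variable names are mine):

```python
def balance_resistor(v1, i1, v2, i2):
    # R = (V2 - V1) / (I1 - I2)
    return (v2 - v1) / (i1 - i2)

# Case 1: currents 2 A and 1 A, voltages 0.8 V and 0.9 V
r_case1 = balance_resistor(0.8, 2.0, 0.9, 1.0)
# Case 2: currents 2 A and 1.9 A, same voltages
r_case2 = balance_resistor(0.8, 2.0, 0.9, 1.9)
print(r_case1, r_case2)  # ~0.1 Ohms vs ~1 Ohm, a factor of ten apart
```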
An extreme case would be I1=100 amps and I2=80 amps, V1=1.8v and V2=2v, then we get:
R=0.01 Ohms.
Big difference, but not only that: at roughly 90 amps per diode after balancing, 0.22 Ohms would drop around 20 volts, and even 0.1 Ohms would drop around 9 volts, and for a 30 volt power supply that would not be very good.
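Assuming the two branch currents end up near the average of the 100 amp and 80 amp figures above, the drop across each candidate resistor works out as (a sketch):

```python
# Assuming the balanced diodes each carry about half the 180 A total
i_branch = (100 + 80) / 2  # ~90 A per diode
for r in (0.22, 0.1, 0.01):
    print(r, "Ohms drops about", round(i_branch * r, 1), "volts")
```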
I happened to specialize in power supplies back in the day, so I might have a lot to say on this subject.