Hi again FB,
As I now understand it, this is your system:
(1) A 1,600 uF capacitor charges from 60V to 300V (a dV of 240V) in 1 second.
(2) A flash bulb triggers and conducts for 1 millisecond, reducing the capacitor voltage back to 60V.
(3) The capacitor recharges, and so the cycle continues.
You have a requirement to minimize the loss in the capacitor, and the only area you have any control over is the charging of the capacitor. The flash discharge just happens in 1 millisecond, as far as you are concerned.
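For reference, a quick sanity check of the energy each flash pulls from the capacitor, derived purely from the figures you gave (standard 0.5*C*V^2 capacitor energy):

    # Energy removed from the capacitor by each flash,
    # using E = 0.5 * C * V^2 at the two voltage levels.
    C = 1.6e-3    # capacitance in Farads (1,600 uF)
    V_hi = 300.0  # voltage just before the flash, in Volts
    V_lo = 60.0   # voltage just after the flash, in Volts

    E_flash = 0.5 * C * (V_hi**2 - V_lo**2)
    print(E_flash)  # about 69.12 Joules delivered per flash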
With the above in mind, the minimum dissipation in the capacitor, and the least stress on it, would be achieved by charging the capacitor with a constant current of 384 mA, which produces a linear voltage ramp across the capacitor from 60V to 300V in 1 second.
The formula for deriving the required constant charging current (Ik) is:
Ik = (C * dV)/t ...(f1)
Where,
Ik: Constant charge current in Amps
C: Capacitance in Farads
dV: Voltage difference in Volts
t: Time in seconds
Thus from (f1),
Ik = (1.6 * 10^-3 * 240)/1 = 0.384 A = 384 mA
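If you want to verify (f1) numerically, it is a two-line calculation (Python here, but any calculator will do):

    # Constant charge current from (f1): Ik = (C * dV) / t
    C = 1.6e-3   # capacitance in Farads
    dV = 240.0   # voltage difference in Volts (300V - 60V)
    t = 1.0      # charge time in seconds

    Ik = (C * dV) / t
    print(Ik)    # 0.384 A, i.e. 384 mA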
Once the above is done, the only other way to further reduce the dissipation in the capacitor would be to extend the interval between flashes, which allows a longer charge time and therefore a lower charge current (see the illustration below).
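To illustrate why that helps, here is a rough sketch assuming the capacitor loss is dominated by its ESR (I^2 * R heating), which is the usual mechanism; the 50 milliohm ESR figure is purely hypothetical, not from your data sheet:

    # Illustration: how stretching the charge time reduces capacitor
    # self-heating, assuming the loss is dominated by ESR (I^2 * R).
    # The 50 mOhm ESR value below is purely illustrative.
    C = 1.6e-3   # Farads
    dV = 240.0   # Volts
    ESR = 0.05   # Ohms (hypothetical, for illustration only)

    for t in (1.0, 2.0, 4.0):
        Ik = C * dV / t             # charge current for this ramp time
        P_charge = Ik**2 * ESR      # dissipation during the charge ramp
        print(t, round(Ik, 3), round(P_charge * 1e3, 2))  # s, A, mW

Doubling the charge time halves the current and quarters the I^2*R loss.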
May I suggest that, to save time, you describe the above to the capacitor manufacturer and ask whether their capacitor would be up to the job.
In terms of a practical implementation, a current-limited fly-back converter would not be far off the ideal either.
I am sorry, and glad, to say that there has been an incorrect assumption about the waveform a fly-back converter would produce when charging the capacitor. In fact, over one second, a 100 kHz fly-back converter will produce 100,000 minute voltage steps, which is little removed from the ideal linear ramp generated by a pure constant current.
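To put a number on how small those steps are, assuming the converter spreads its delivery evenly across the switching cycles (a simplification; the real per-cycle step varies a little over the ramp):

    # Average voltage step per switching cycle of a 100 kHz fly-back
    # converter charging over 1 second (idealised: equal steps).
    dV = 240.0     # total voltage rise, Volts
    f_sw = 100e3   # switching frequency, Hz
    t = 1.0        # charge time, seconds

    n_steps = f_sw * t   # 100,000 cycles in one second
    step = dV / n_steps
    print(step * 1e3)    # about 2.4 mV average per step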
spec