Nah, you understand it just fine.
I just couldn't figure out why you thought a linear regulator or converter couldn't go down to near 0V, so I thought maybe that was the case. But it wasn't, and that post was deleted.
I see your problem now. It's because the 1.25V is being used as a reference for linear regulators and DC-DC converters, and the resistive divider feedback normally used only lets you step down the output voltage for comparison against the 1.25V. You would need to "step up" the output voltage and compare it to the 1.25V in order to output a voltage less than that. The problem is getting around that 1.25V bandgap reference.
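To make the floor concrete, here's the standard adjustable-regulator math (LM317-style, with the small adjust-pin current ignored and resistor values picked purely for illustration). Since the divider can only multiply the reference up, the output can never drop below the reference itself:

```python
# Classic adjustable regulator feedback: Vout = Vref * (1 + R2/R1).
# LM317-style numbers used as an illustration; Iadj * R2 term ignored.
V_REF = 1.25  # bandgap reference voltage (V)

def regulator_vout(r1_ohms: float, r2_ohms: float) -> float:
    """Output voltage set by the feedback divider on an adjustable regulator."""
    return V_REF * (1 + r2_ohms / r1_ohms)

# Even with R2 = 0 the output bottoms out at the reference:
print(regulator_vout(240, 0))    # 1.25 V -- the absolute floor
print(regulator_vout(240, 720))  # 5.0 V  -- a typical higher setting
```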
Here's how to do it...
1. Use a power op-amp (needed for your high power/currents) wired as a voltage follower (aka buffer).
2. Have a variable voltage divider (potentiometer) feed its wiper output into the input of the op-amp voltage follower.
3. Have a linear regulator feed the top of the divider (and it's best to also use it to power the op-amp).
Really, any input voltage for the divider (i.e. switching, linear, or a voltage reference IC) will work as long as it's regulated, because the divider is stepping down the input voltage and telling the op-amp voltage follower to output that voltage. If the base voltage being stepped down is inaccurate, the output voltage will be too. The purpose of the op-amp voltage follower/buffer is to provide the current capability that you require, since a voltage divider cannot supply varying load currents without its output voltage fluctuating.
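Here's a quick sketch of that arrangement (the 5V regulated input is just an example value): the pot wiper picks off a fraction of the regulated voltage, and an ideal unity-gain buffer reproduces that voltage while supplying the load current itself, so the divider stays unloaded:

```python
# Potentiometer divider feeding a voltage follower -- example values only.
V_REG = 5.0  # regulated voltage at the top of the pot (V), assumed for illustration

def divider_out(wiper_fraction: float, v_in: float = V_REG) -> float:
    """Voltage at the pot wiper: a fraction (0..1) of the regulated input."""
    return wiper_fraction * v_in

def follower_out(v_set: float) -> float:
    """Ideal unity-gain buffer: the output tracks the input voltage,
    but the op-amp (not the divider) sources the load current."""
    return v_set

for frac in (0.0, 0.25, 0.5, 1.0):
    v = follower_out(divider_out(frac))
    print(f"wiper at {frac:.0%}: output = {v:.2f} V")  # 0 V all the way up to V_REG
```

Note the first line of that output: with the wiper at the bottom, the set point is a true 0V, which is exactly what the regulator's own feedback divider could never ask for.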
You will want a rail-to-rail input and output op-amp in order to get very close to 0V. If you can't find a rail-to-rail power op-amp, you might have to use bipolar supplies so the op-amp can output 0V (since with bipolar supplies, 0V is now in the middle of the range instead of at the extreme).
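As a rough illustration of the swing problem (the 1.5V headroom figure is an assumption; real parts specify this in the datasheet): a non-rail-to-rail op-amp can only swing to within some margin of each supply rail, so on a single supply it never actually reaches 0V, while on bipolar supplies 0V sits comfortably inside the achievable range:

```python
def output_range(v_neg: float, v_pos: float, headroom: float = 1.5):
    """Achievable output swing for an op-amp that can't drive all the
    way to its rails. The headroom value is an assumed round number;
    check the actual output-swing spec in your op-amp's datasheet."""
    return (v_neg + headroom, v_pos - headroom)

print(output_range(0.0, 12.0))    # (1.5, 10.5)  -- single supply: 0 V unreachable
print(output_range(-12.0, 12.0))  # (-10.5, 10.5) -- bipolar: 0 V is mid-range
```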