I was taking another look at switching converters, specifically PMOS buck converters that can work at 100% duty cycle but suffer from the deficiencies of PMOS (relative to NMOS). It seems that quite a bit of effort goes into using NMOSes in buck converters because of their better characteristics (which also means better availability), but that requires added complexity because of high-side bootstrapping, etc., and as a result they still can't work at 100% duty cycle (not without the massive cost and technical problems of a floating gate-drive supply, which is silly in a buck converter since that supply is more complicated than the buck converter itself).
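To make the bootstrapping limitation concrete, here is a minimal sketch (all component values are illustrative assumptions, not from any specific part) of why a bootstrapped high-side NMOS drive collapses as duty approaches 100%: the bootstrap capacitor loses charge every cycle but can only be refilled during the off-time, and at D = 1 there is no off-time.

```python
# Sketch: per-cycle charge budget of a bootstrapped high-side NMOS gate drive.
# Component values (c_boot, q_gate, i_q, f_sw) are assumed, typical-ish numbers.

def bootstrap_droop(duty, f_sw=500e3, c_boot=100e-9, q_gate=10e-9, i_q=100e-6):
    """Voltage droop on the bootstrap capacitor over one switching cycle.

    duty   : switching duty cycle (0..1)
    f_sw   : switching frequency, Hz
    c_boot : bootstrap capacitance, F
    q_gate : total gate charge of the NMOS, C
    i_q    : quiescent current of the floating driver, A
    """
    t_on = duty / f_sw
    # Charge removed each cycle: one gate charge plus the floating driver's
    # quiescent draw during the on-time. The refresh path (through the
    # bootstrap diode, while the switch node is low) only conducts during
    # the off-time, so at duty = 1 the capacitor is never replenished.
    dq = q_gate + i_q * t_on
    return dq / c_boot

for d in (0.5, 0.9, 0.99, 1.0):
    droop = bootstrap_droop(d)
    can_refresh = (1 - d) > 0
    print(f"D={d:.2f}: droop = {droop*1e3:.1f} mV/cycle, refresh possible: {can_refresh}")
```

At D = 1.0 the droop repeats every cycle with no refresh, so the gate-drive voltage walks down until the FET can no longer be enhanced, which is why bootstrapped designs cap the duty cycle below 100%.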
It seems that this problem has its root in the legacy mistake of defining charge as flowing from positive to negative (conventional flow), so that this "conventional charge sink" was defined as ground. Since then, all electronics have been built using the conventional charge sink as ground.
But to my understanding, electronics could just as easily have been designed so that the source of conventional charges/holes (i.e. the electron sink) is the reference, and it probably would have been if things had been defined correctly at the start. If that had happened, then the NMOS could easily be used on the high side of a buck converter without any high-side drive complications.
Is this reasoning correct? Particularly the part in bold. It's just so pervasive now that it's hard to be sure whether you could redesign all electronics so that (describing this in terms of conventional current flow) the highest +V in the system is the reference, and the various voltage supplies are derived by sitting at varying potentials below +V. Basically, flipping things around so that the electron sink is actually the reference voltage of the system.
Analogy: instead of dropping different balls from different altitudes so that they fall onto level concrete, we drop all the balls from the same altitude so that they fall into holes of various depths.