Optikon
New Member
Hi all,
I have an application where I will be using a power MOSFET (N-channel) as a switch. I would like the device to transition through its linear region as rapidly as possible, and I am contemplating the drive circuit for this.
A typical drive circuit consists of a low-impedance voltage source driving the gate, and at high frequencies, the lower the source impedance can be, the better. One of the problems with voltage drive is the gate capacitance to ground (common return), since the driver's output impedance, which rises at higher frequencies, interacts with that parasitic capacitance and slows gate charging.
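To put a rough number on that interaction, here is a quick Python back-of-envelope; the 2-ohm driver impedance and 5 nF effective gate capacitance are values I am assuming purely for illustration, not from any particular part:

# Rough RC estimate of voltage-drive gate charging.
# Values are illustrative assumptions, not datasheet numbers.
R_source = 2.0     # driver output impedance, ohms (assumed)
C_gate   = 5e-9    # effective gate capacitance, farads (assumed)

tau    = R_source * C_gate   # RC time constant of the gate node
t_rise = 2.2 * tau           # 10%-90% rise time of a single-pole RC
print(f"tau = {tau*1e9:.1f} ns, 10-90% rise = {t_rise*1e9:.1f} ns")

Even a 2-ohm driver into 5 nF gives a 22 ns rise time, which is why I keep chasing lower source impedance.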
And for large-geometry power devices the problem is at its worst, since those parasitics tend to be larger. In the past I have designed a push-pull unity-gain buffer as the driver, good out to over 100 MHz, but I would like to try something else as an alternative this time around.
I would like to know if anyone has ever experimented with driving devices like this with a current source instead. My line of thinking goes something like the following:
1) A current source is not bothered by the gate capacitance (the one to ground, that is), so the parasitic that limited the voltage-source drive is easily overcome.
2) The current source will not like parasitic inductance, however, but this is EASY! (Or so I think.) The MOS device itself is not inherently inductive apart from its leads, which I can keep very short. I can also lay out the design so that I do not introduce any additional L with the routing/wiring.
3) But alas, some inductance will remain, and it should be in the tens of nH if things go well. This will present an L(di/dt) effect that will cause a voltage transient, which could rob my current source of headroom and momentarily turn it off. I can increase my supply voltage to a level that keeps the current source working and gives acceptable results (a rough headroom estimate follows this list).
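As a sanity check on points 1) and 3), here is a quick sketch of the numbers; the 1 A drive current, 5 nF effective gate capacitance, 20 nH loop inductance, and 10 ns current ramp are all assumptions I picked for illustration:

# Current-source gate drive: linear gate slew plus inductive transient.
# All values are illustrative assumptions, not datasheet numbers.
I_drive = 1.0      # gate drive current, amps (assumed)
C_gate  = 5e-9     # effective gate capacitance, farads (assumed)
L_par   = 20e-9    # parasitic loop inductance, henries (tens of nH)

dv_dt = I_drive / C_gate    # gate slew rate from I = C*(dV/dt)
t_10V = 10.0 / dv_dt        # time to slew the gate from 0 V to 10 V

di_dt = I_drive / 10e-9     # assume the drive current ramps up in 10 ns
V_L   = L_par * di_dt       # transient developed across the parasitic L

print(f"gate slew = {dv_dt/1e6:.0f} V/us, 0 -> 10 V in {t_10V*1e9:.0f} ns")
print(f"L*(di/dt) transient ~ {V_L:.1f} V of extra headroom needed")

With those guesses the inductive transient is only a couple of volts, so a modest bump in supply voltage should keep the source in compliance.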
I would somehow need to turn on my current source quickly so it can supply adequate charge to the gate and move through the gate-charge curve very quickly.
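Since total gate charge is what really sets the timing, the required current falls straight out of Q = I*t; the 100 nC figure below is just my guess at what a 100 W part might spec:

# Switching time from total gate charge: t = Q_g / I_drive.
# Q_g is an assumed ballpark for a ~100 W MOSFET, not a real spec.
Q_g = 100e-9                            # total gate charge, coulombs (assumed)
for I_drive in (0.1, 0.5, 1.0, 2.0):    # candidate drive currents, amps
    t_sw = Q_g / I_drive
    print(f"I = {I_drive:.1f} A -> t ~ {t_sw*1e9:.0f} ns")

So even a few hundred mA of drive gets me well under a microsecond, provided the current source itself turns on fast enough.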
I do not have a schematic yet, and I do not have a specific turn-on time requirement. I also do not have any special circumstances in the design, such as isolation or inductive loads on the drain. I will be driving this system open-loop (on-off style). I don't have a power MOSFET part in mind yet either, but it will be typical of a 100 W device with Vth = 2-4 V, milliohm-range Rds(on), and Vgs = 10 V.
I am looking for anyone's thoughts on trying this method and the problems I might encounter. I am very curious to discover what the limiting factors will be. I have read a thread on voltage-controlled versus current-controlled transistors, and I don't want to go there. Please comment on driving the gate with a current source.
THANKS!!