Electronics4you said:
So I'll need a 1.5A * 18V (maximum) = 27W resistor, and that's a lot. I can't even get one under $30. Is my calculation wrong?
Compromising: (18V/47R) * 18V = 6.89W - I can get that
Yes, completely wrong!
Echoing Nigel... where was the 18V derived from?
But then it comes back to the 1.5A requirement.
MOSFETs are voltage-controlled devices, but they do require charging current. (I lost a job interview because some ass-hat would not accept that you need to pump current into the gate of a FET.)
Anyway, at turn-on/off you might get 1.5A at the start (the actual inductance and resistance will determine the actual current... my gate drives try to push 12A up an IGBT gate, but due to the inductance in the gate leads they rarely get over 6A).
But as the gate region charges, the current flowing into the gate decreases (because the potential difference between the gate and the drive decreases) - effectively a capacitor is being charged.
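As a rough sketch of that decay, you can treat the gate as a plain RC circuit (ignoring the Miller plateau for now). The drive voltage, gate resistance, and input capacitance below are just example numbers I've picked, not anything from this thread:

```python
import math

# Simple RC model of the gate: i(t) = (V_drive / R_gate) * exp(-t / tau).
# All values are hypothetical examples -- substitute your own.
V_drive = 12.0     # gate-drive voltage (V)
R_gate  = 12.0     # gate resistor (ohms) -> 1 A peak at the edge
C_iss   = 4.7e-9   # effective input capacitance (F), from the datasheet

tau = R_gate * C_iss  # RC time constant of the gate circuit

# Gate current at multiples of the time constant: it collapses fast.
for n in range(6):
    i = (V_drive / R_gate) * math.exp(-n)
    print(f"t = {n} tau: gate current = {i:.3f} A")
```

After about five time constants the gate current is essentially zero, which is why the resistor only heats up during the switching edges.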
Once the FET is hard on/off, no significant current flows to/from the gate, and thus the gate resistor dissipates no power (it is only during switching events that it will).
Now, if you just want to turn the FET on and then leave it on, a 1/4W or 1/8W resistor should do. BUT if you want to PWM the FET, you have to work with RMS current.
The actual charging curve of a FET's gate capacitance isn't exactly the exponential of a normal capacitor (there is a nice Miller effect at a certain voltage where the gate capacitance is effectively infinite in size), but a normal cap curve will suffice.
Take that waveform and, for your given switching frequency, work out the RMS current, then use I²R to work out the power that will be dissipated in the gate resistor.
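Here's a minimal sketch of that RMS-then-I²R calculation, again using the simple RC approximation (no Miller plateau) and made-up example values. Each PWM period has two edges (turn-on and turn-off), and integrating i(t)² = I_pk² exp(-2t/τ) over one edge gives I_pk²·τ/2:

```python
import math

# Hypothetical example values -- plug in your own FET and drive numbers.
V_drive = 12.0      # gate-drive voltage (V)
R_gate  = 47.0      # gate resistor (ohms)
C_iss   = 4.7e-9    # effective input capacitance (F), from the datasheet
f_sw    = 100e3     # PWM switching frequency (Hz)

tau  = R_gate * C_iss      # RC time constant of the gate circuit
I_pk = V_drive / R_gate    # peak gate current at each switching edge

# RMS^2 = (energy of two exponential pulses per period) * f_sw:
# each edge contributes integral of I_pk^2 * exp(-2t/tau) dt = I_pk^2 * tau / 2.
I_rms_sq = 2 * f_sw * (I_pk ** 2) * (tau / 2)
I_rms    = math.sqrt(I_rms_sq)

P_resistor = I_rms_sq * R_gate  # I^2 * R dissipated in the gate resistor

print(f"Peak gate current : {I_pk * 1e3:.1f} mA")
print(f"RMS gate current  : {I_rms * 1e3:.1f} mA")
print(f"Resistor power    : {P_resistor * 1e3:.1f} mW")
```

Sanity check: with all the drive resistance lumped into R_gate, this collapses to P = C·V²·f, and for these numbers it lands in the tens of milliwatts, i.e. a 1/4W part is plenty.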
I have a very nice Excel spreadsheet that does all this, but it is on my work network (the Python script that *should* do this is also a bit buggy).