I was considering posting this in the firmware section, but it's really more of an overall question, because it involves the electrical engineering of the motor as well as the drive electronics.
I have a workable brushless motor driver design and code. It reads Hall-effect sensors and uses a lookup table to generate commutation signals, as one would expect. If you take that and just vary the drive voltage, the motor speed naturally changes. The electronics don't have to do anything extra, since they're already reacting to the motor's actual position.
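A rough sketch of what I mean is below. The pin masks, function names, and table order are made up for illustration; the real mapping depends on the bridge wiring and the Hall sensor placement. One bit per bridge transistor: AH AL BH BL CH CL.

```c
#include <stdint.h>

static const uint8_t commutation_table[8] = {
    0x00, /* 000: invalid Hall code -> all off */
    0x21, /* 001: A-high + C-low */
    0x24, /* 010: A-high + B-low */
    0x06, /* 011: C-high + B-low */
    0x12, /* 100: C-high + A-low */
    0x18, /* 101: B-high + A-low */
    0x09, /* 110: B-high + C-low */
    0x00, /* 111: invalid Hall code -> all off */
};

uint8_t read_halls(void);            /* returns the 3-bit Hall code, 0..7 */
void    set_phase_outputs(uint8_t);  /* drives the six gate signals       */

void commutate(void) {
    set_phase_outputs(commutation_table[read_halls()]);
}
```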
Obviously, in actual use you want the speed control built into the driver. For now I have ANDed a 25 kHz PWM signal with the commutation table output, internally, and that seems to work. Is that the right way to do it?
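By "ANDed" I mean the gating happens inside the same update and chops all six outputs at once, something like this (reusing the made-up names from above):

```c
uint8_t pwm_level(void);  /* hypothetical: 1 during the 25 kHz on-time, else 0 */

/* When the PWM is high, the table pattern passes through unchanged;
 * when it's low, everything is switched off. */
void update_outputs(void) {
    uint8_t pattern = commutation_table[read_halls()];
    set_phase_outputs(pwm_level() ? pattern : 0x00);
}
```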
My concern is how these two frequencies interact. Here's my math (which could be based on flawed assumptions, so let me know):
Motor spins at up to 20 kRPM = 333 revolutions per second.
12 windings, so it passes poles at 4 kHz at max speed.
3 commutation states per pole passing, so the state changes at 12 kHz.
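Spelled out, with nothing assumed beyond those numbers:

```c
#include <stdio.h>

int main(void) {
    const double rev_per_s = 20000.0 / 60.0;    /* 20 kRPM -> ~333 rev/s */
    const double pole_hz   = rev_per_s * 12.0;  /* 12 windings -> ~4 kHz */
    const double state_hz  = pole_hz * 3.0;     /* 3 states -> ~12 kHz   */
    const double pwm_hz    = 25000.0;
    printf("pole passing: %.0f Hz, commutation: %.0f Hz, PWM/commutation: %.2f\n",
           pole_hz, state_hz, pwm_hz / state_hz);
    return 0;
}
```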
So this means that at maximum speed the power chopping is only about twice as fast as the commutation switching, and the phase alignment between the two will be all over the place. Some pulses get chopped right in the middle of a commutation phase, while others go through almost uninterrupted. As far as variance is concerned, this should average out both mechanically and electrically. But is there likely to be any damage from some of the pulses being too short? For example, what if the commutation turns on the driver and 2 ns later the PWM turns it off? The transistor may only get halfway on before it switches back off, dissipating in its linear region for that instant. But since it only happens once in a while, does the heat simply never build up enough to matter? And that's only at max speed anyway.
Is there a method generally used to align the phases, or is this a non-issue? Maybe there is some totally different way to do this that I'm unaware of. It does seem to work, but some of the drivers have burnt out at lower load levels than others, which makes me think the chopping might be a factor. It's very hard to tell, though.
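For what it's worth, the only alignment scheme I can picture is latching commutation changes so they only take effect at a PWM period boundary, something like this (hypothetical ISR hooks, reusing the names from above):

```c
#include <stdint.h>

volatile uint8_t pending_pattern; /* written by the Hall ISR, applied below */

/* Hall edge: look up the new pattern, but don't apply it yet. */
void hall_change_isr(void) {
    pending_pattern = commutation_table[read_halls()];
}

/* Start of every 25 kHz PWM period: apply the latest pattern here, so a
 * commutation edge always lines up with a PWM edge and no runt pulses
 * reach the gates. */
void pwm_period_isr(void) {
    set_phase_outputs(pending_pattern);
}
```

The obvious cost is up to one PWM period (40 us) of commutation delay, which at a 12 kHz state rate is a big chunk of each ~83 us state, so I don't know whether that's actually how it's done.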