How to chop the signal for speed control of a brushless motor?


Triode

Well-Known Member
I was considering posting this in firmware, but it's more of an overall question because it involves the electrical engineering of the motor and the drive electronics.

I have a workable brushless motor driver design and code. It reads the Hall-effect sensors and uses a look-up table to generate the commutation signals, as one would expect. If you take that and just vary the drive voltage, the motor speed naturally changes. The electronics don't have to do anything extra, as they are already reacting to the motor's actual position.

Obviously in use, you want to include the speed control in the driver. For now I have ANDed a 25 kHz PWM signal with the commutation table, internally, and that seems to work. Is that the right way to do it?
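
Roughly what I mean, as a minimal C sketch (the bit layout, table values and the output routine are just placeholders, not my actual code):

```c
/*
 * Minimal sketch of ANDing a PWM carrier with the commutation table output.
 * Gate bits: AH=5, AL=4, BH=3, BL=2, CH=1, CL=0 (example layout only).
 */
#include <stdint.h>
#include <stdbool.h>

static const uint8_t commutation_table[8] = {
    0x00,  /* Hall 000: invalid   */
    0x21,  /* Hall 001: AH + CL   */
    0x06,  /* Hall 010: CH + BL   */
    0x24,  /* Hall 011: AH + BL   */
    0x18,  /* Hall 100: BH + AL   */
    0x09,  /* Hall 101: BH + CL   */
    0x12,  /* Hall 110: CH + AL   */
    0x00   /* Hall 111: invalid   */
};

extern void write_gate_outputs(uint8_t gates);  /* hypothetical hardware write */

void update_gates(uint8_t hall_state, bool pwm_high)
{
    /* Look up the active high/low pair for this rotor position, then AND it
     * with the current level of the 25 kHz carrier: when the carrier is low,
     * all gates are driven off. */
    uint8_t gates = pwm_high ? commutation_table[hall_state & 0x07] : 0x00;
    write_gate_outputs(gates);
}
```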

My concern is how these interact. Here's my math (which could be based on flawed assumptions, so let me know):

motor spins at up to 20,000 RPM = ~333 revolutions per second
12 windings, so poles are passed at up to ~4 kHz at max speed
3 commutation states, so the commutation state changes at up to ~12 kHz
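
Spelled out as a quick sanity check (plain arithmetic, nothing motor-specific):

```c
/* Quick check of the numbers above. */
#include <stdio.h>

int main(void)
{
    double rpm           = 20000.0;
    double rev_per_sec   = rpm / 60.0;                 /* ~333 rev/s            */
    double poles_per_rev = 12.0;                       /* 12 windings passed    */
    double pole_freq     = rev_per_sec * poles_per_rev;/* ~4 kHz                */
    double commutations  = pole_freq * 3.0;            /* ~12 kHz state changes */
    double pwm_freq      = 25000.0;                    /* 25 kHz chop           */

    printf("commutation rate: %.0f Hz\n", commutations);
    printf("PWM / commutation ratio: %.1f\n", pwm_freq / commutations);
    return 0;
}
```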

So this means at maximum speed the power chopping could be only about twice as fast as the commutation switching. The phase alignment will be all over the place: some pulses get chopped right in the middle of a commutation phase, while others go through almost uninterrupted. As far as the average is concerned, this will mechanically and electrically even out. But is there likely to be any damage from some of the pulses being too short? For example, what if the commutation turns on the driver and 2 ns later the PWM turns it off? Then the transistor may only turn halfway on before switching back off. But since it's only once in a while, does it not really matter because the heat can't build up? Also, that's only at max speed.

Is there a method generally used to align the phases, or is this a non-issue? Maybe there is some totally different way to do this that I'm unaware of. It does seem to work, but some of the drivers have burnt out at lower load levels than others, which makes me think the chopping may be a factor. But it's very hard to tell.
 
I've found something about chopping on the low-side switches, since those can often switch faster. I'll have to give that a try, though I'm short on details. I'll experiment with what I have. Luckily I have a scope with 4 channels and some current probes, so I should be able to see if there is any problem with the switching.
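
If I understand it right, low-side chopping would just mean gating only the low-side bits with the PWM, something like this sketch (same hypothetical bit layout as before):

```c
/* Variant of the earlier gating sketch: the high-side transistor stays on
 * for the whole commutation step and only the low side is chopped. */
#include <stdint.h>
#include <stdbool.h>

#define HIGH_SIDE_MASK 0x2Au   /* AH, BH, CH */
#define LOW_SIDE_MASK  0x15u   /* AL, BL, CL */

extern void write_gate_outputs(uint8_t gates);  /* hypothetical hardware write */

void update_gates_low_side_chop(uint8_t commutation_pattern, bool pwm_high)
{
    uint8_t gates = commutation_pattern & HIGH_SIDE_MASK;  /* always follow the table */
    if (pwm_high) {
        gates |= commutation_pattern & LOW_SIDE_MASK;      /* chop only the low side  */
    }
    write_gate_outputs(gates);
}
```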
 
Rather than chop pulses, you could just introduce/extend delays between them in code. The greater the delay the slower the motor will run.
 
Using Hall sensors for switching implies a BLDC motor where only two field windings are driven at any one time; if all three windings are controlled, it is being run as a 3-phase motor, for which the commutation is slightly different.
The attached chart (commutation.pdf) shows the relationship between the Hall sensors and the windings; it shows the signal generated when the motor is back-fed (rotated externally).
Max.
 

Attachments

  • commutation.pdf
Also, as the MC33035 application note shows, it can be used in either BLDC or 3-phase mode.
It will also run a brushed DC motor when set to 60° commutation.
Max.
 
This may be an over-simplistic reply, and it is not meant to be insulting in any way... have you tried one of the many brushless speed controllers available for RC use?
 

That would be sound advice in some cases, but in this case I am building BLDC drivers to get better at designing them. Also, for much of what I do I plan to eventually design them for servos, and the response time on PWM-controlled drivers isn't usually fast or precise enough.
 
Rather than chop pulses, you could just introduce/extend delays between them in code. The greater the delay the slower the motor will run.

I was thinking something like that might be far more efficient, especially when you get into servo applications for the driver. I can't picture exactly how to pull that off, though I could probably find a very hacky way to do it. I wonder if there are any examples.

I guess timers don't take very long to set. I could have it so that when one sensor pulse comes in, I do a simple calculation of when the next pulse would occur if the motor were at the desired speed, then set a timer that must expire before that next pulse is allowed to commutate. I'm just thinking that may be less lag than trying to calculate and set the delay on the current pulse. Something like the sketch below.
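
Roughly this, as a sketch (the timer calls, the scheduling helper and the edges-per-revolution figure are made-up placeholders, not tested code):

```c
/* Enforce a minimum Hall-edge-to-edge period corresponding to the target speed. */
#include <stdint.h>

extern uint32_t timer_now_us(void);                     /* free-running microsecond timer */
extern void     apply_commutation(uint8_t hall_state);  /* table lookup + gate write      */

static uint32_t last_edge_us;
static uint32_t min_period_us;  /* ~1e6 / (target_rev_per_s * edges_per_rev), set elsewhere */

/* Called from the Hall-sensor edge interrupt. */
void hall_edge_isr(uint8_t hall_state)
{
    uint32_t now     = timer_now_us();
    uint32_t elapsed = now - last_edge_us;
    last_edge_us = now;

    if (elapsed >= min_period_us) {
        /* At or below the target speed: commutate straight away. */
        apply_commutation(hall_state);
    } else {
        /* Faster than the target: hold this step off until the minimum
         * period has passed, e.g. with a one-shot timer:
         * schedule_commutation(hall_state, min_period_us - elapsed);   */
    }
}
```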
 
I did something like this some years ago when designing a speed control for a 2-wire BLDC motor (computer fan) which had internal commutation electronics of unknown flavour and no external access. I used a low-side-switching power transistor with a series current-sense resistor. Each time the sensed motor drive current dropped to zero during commutation I triggered a monostable and kept the transistor switched off for the duration of the monostable period. Simple. Varying the monostable period gave very smooth speed control down to almost zero rpm. This method avoids conflict between internal and external switching.
 
Hi alec_t
Would you mind showing a simplified sketch, if not a full schematic?
Thanks.
 
Can't find my original schematic, but here's the general idea, Agustin:

[Schematic: Synchronous pulse delay commutation]

The arrangement effectively stretches the normal brief current interruptions as the windings are commutated. As the transistor is turned off when there is already ~zero current through it, there is no significant BEMF generated so no interference with the internal switching in the motor.
 

Thanks, Alec.

Clever, since you don't interfere with the motor's own commutation but take advantage of it. Nice!
 
