Hi everyone,
I've got a pump driven by a PIC, using a voltage signal for control and a pressure transducer for feedback. The PIC's main task is to keep a steady pressure in a small chamber by varying the voltage. The pump fills the chamber, and once it's full the pressure starts to rise, changing the readings from the transducer.
I've got the PID loop pretty much figured out and tuned, but I'm having problems with the initial phase of the process. Because there's a flow restrictor just before the chamber, it takes the pump a while to fill it, roughly 4-5 s. This means that when the pump is started, the pressure sits at 0 bar for the first 4-5 s and then shoots up quickly to 10 bar (the maximum possible pressure), causing an overshoot pretty much every time, regardless of the setpoint. The cause is the PID controller raising the voltage to the maximum value during the initial phase: the controller 'thinks' the pressure isn't rising because the control signal is too small, and keeps increasing it until it hits the max. Of course, once the chamber is full and the pump is running at full speed, the pressure shoots up very quickly and the PID takes 2-3 s to bring it back down to the proper level.
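For what it's worth, the behaviour described here is classic integral windup, and one common fix is conditional integration: stop accumulating the integral term while the output is saturated. A minimal C sketch of that idea is below; the gains, limits, and struct layout are placeholders I made up for illustration, not my actual tuned values:

```c
#include <assert.h>

/* Illustrative PID state; all values here are placeholders. */
typedef struct {
    double kp, ki, kd;   /* gains                         */
    double integral;     /* accumulated integral term     */
    double prev_err;     /* previous error, for the D term */
    double out_min, out_max; /* actuator (voltage) limits */
} pid_ctrl;

/* One PID update with conditional integration (anti-windup):
   the integral only accumulates while the output is unsaturated,
   so it cannot wind up during the 4-5 s fill phase. */
double pid_step(pid_ctrl *p, double setpoint, double measured, double dt) {
    double err = setpoint - measured;
    double deriv = (err - p->prev_err) / dt;
    double out = p->kp * err + p->ki * (p->integral + err * dt) + p->kd * deriv;

    if (out > p->out_max)       out = p->out_max;   /* saturated high: freeze integral */
    else if (out < p->out_min)  out = p->out_min;   /* saturated low: freeze integral  */
    else                        p->integral += err * dt;

    p->prev_err = err;
    return out;
}
```

With this in place the output still pegs at max during the fill (which is what you want, to fill quickly), but the integrator stays small, so the controller can react immediately once the pressure finally starts to rise.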
So the question is: is there a way to avoid this? After the chamber is full the system is quite responsive, and the 1000 ms sample interval I'm using for the PID loop controls it well; it's only the initial phase that causes problems. I was thinking about putting a 3-4 s delay between starting the pump and starting the PID, but I'd have to run the pump at full voltage (to fill the chamber quickly) while the PID would start from zero voltage, so it wouldn't work properly. It will either overshoot if I initialize the PID's control signal at max, or dip down if I initialize it at 0.
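The "start at max, hand over to the PID" idea can be made to work if the handover is bumpless: run open-loop at full voltage until the pressure starts to rise, then preload the PID's integrator so its first output lands near an estimated steady-state voltage rather than 0 or max. A self-contained C sketch of this staged start follows; the fill threshold, handover voltage, and PI gains are all assumptions for illustration:

```c
#include <assert.h>

/* Minimal PI state; gains and limits are illustrative placeholders. */
typedef struct {
    double kp, ki;
    double integral;
    double out_min, out_max;
} pi_ctrl;

static double pi_step(pi_ctrl *c, double err, double dt) {
    double out = c->kp * err + c->ki * (c->integral + err * dt);
    if (out > c->out_max)      out = c->out_max;
    else if (out < c->out_min) out = c->out_min;
    else                       c->integral += err * dt;  /* anti-windup */
    return out;
}

/* Staged start: full voltage until pressure begins to rise, then hand
   over to the PI with the integrator preloaded so its first output is
   roughly handover_v (bumpless transfer). fill_threshold and handover_v
   are guesses, not measured values. */
double control_voltage(pi_ctrl *c, int *filling,
                       double setpoint, double pressure_bar, double dt) {
    const double fill_threshold = 0.5; /* bar: pressure has started rising */
    const double handover_v = 4.0;     /* rough steady-state voltage guess */

    if (*filling) {
        if (pressure_bar < fill_threshold)
            return c->out_max;             /* open loop: fill at full voltage */
        c->integral = handover_v / c->ki;  /* preload integrator (ki != 0)    */
        *filling = 0;                      /* switch to closed loop           */
    }
    return pi_step(c, setpoint - pressure_bar, dt);
}
```

The key line is the integrator preload: at handover, with a small error, the PI's output starts from the preloaded value instead of jumping to 0 or staying pinned at max, which avoids both the dip and the overshoot described above.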
Any ideas?
Regards,
dsc.