Performance optimization under robustness constraints is frequently encountered in process control. Motivated by the analytical difficulties of the conventional robustness index, e.g., the maximum sensitivity, we introduce the relative delay margin as an alternative that permits much simpler robustness analysis. This point is illustrated by designing an optimal PI controller for the first-order-plus-dead-time (FOPDT) model. It is first shown that the PI controller parameters can be derived analytically in terms of a new pair of parameters, namely the phase margin and the gain crossover frequency. The stability region of the PI controller is then obtained with a much simpler procedure than existing approaches. It is further shown that a given relative delay margin represents the robustness level well, and its contour can be sketched more simply than with the maximum sensitivity index. Subject to constraints on the relative delay margin, an optimal disturbance rejection problem is then formulated and solved analytically. Simulation results show that the proposed methodology outperforms other PI tuning rules. The relative delay margin is thus a promising robustness measure for the analysis and design of other advanced controllers as well.