Abstract

A bang-bang control law for digital-processor implementation is presented that directly implements a nonlinear switching surface for sub-time-optimal control of a single-input, time-invariant linear plant of arbitrarily high order with unbalanced control levels. The control law requires the complete state vector and must usually be used with a state estimator or observer. Results are presented for attitude control of a space satellite having a single flexible mode with zero damping. The control law is a development of a predictive controller for a plant comprising cascaded pure integrators, in which an analogue model of the plant is repeatedly run ahead in time to predict the future plant behaviour under both signs of extreme drive; the sign of the real plant drive is then determined from sign-change counts in the model runs. Direct digital implementation of this predictive control law results in unequal, and possibly excessive, periods between plant-drive updates. These problems are eliminated in the proposed control-law algorithm by computing the number of sign changes directly, generating the required sign of plant drive from a set of equations involving only the state estimate and the assumed plant-drive levels. Rapid determination of the plant drive is facilitated by the presence of only elementary mathematical operations in the algorithm. Application to a linear plant other than cascaded integrators is handled by generating the real-time plant model of the state estimator through a transformation of the plant state equation to companion form, ensuring stable control in the neighbourhood of the state origin. In fact, no instabilities have been noted in simulations carried out to date, even for starting points relatively far removed from the origin of the state space.
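To make the run-ahead idea concrete, the following minimal Python sketch applies it to a double integrator (the simplest cascade of pure integrators) with unbalanced drive levels. It is not the paper's algorithm: the plant model, the drive levels U_POS and U_NEG, the step size and prediction horizon, and the tie-breaking rule are all illustrative assumptions, and the sketch emulates the earlier predictive controller by simulating both extreme drives and comparing sign-change counts of the predicted position rather than computing those counts directly from the state estimate.

```python
# Illustrative sketch of the predictive (run-ahead) bang-bang idea described
# above, applied to a double integrator with unbalanced drive levels.
# All numerical values and the tie-breaking rule are assumptions, not taken
# from the paper.
import numpy as np

U_POS, U_NEG = 1.0, -0.5        # unbalanced extreme drive levels (assumed)
DT, HORIZON = 0.05, 200         # model step and look-ahead length (assumed)

A = np.array([[0.0, 1.0],       # cascaded integrators: x1' = x2, x2' = u
              [0.0, 0.0]])
B = np.array([0.0, 1.0])

def sign_changes(x0, u):
    """Run the plant model ahead under constant drive u and count sign
    changes of the predicted position x1 (forward-Euler integration)."""
    x, count, prev = x0.copy(), 0, np.sign(x0[0])
    for _ in range(HORIZON):
        x = x + DT * (A @ x + B * u)
        s = np.sign(x[0])
        if s != 0 and s != prev:
            count, prev = count + 1, s
    return count

def bang_bang_drive(x):
    """Pick the extreme drive whose predicted run shows more zero crossings
    of the position; for a double integrator this switches to the braking
    drive just past the time-optimal switching curve (slightly late, hence
    sub-time-optimal)."""
    n_pos, n_neg = sign_changes(x, U_POS), sign_changes(x, U_NEG)
    if n_pos != n_neg:
        return U_POS if n_pos > n_neg else U_NEG
    return U_NEG if x[0] > 0.0 else U_POS   # tie-break: push toward origin

# Closed-loop simulation from an offset initial state; the state should
# settle near the origin with the small chatter typical of bang-bang control.
x = np.array([2.0, 0.0])
for _ in range(1000):
    x = x + DT * (A @ x + B * bang_bang_drive(x))
print("final state:", x)
```

As the abstract notes, the paper's contribution replaces these repeated model runs with a direct computation of the sign-change counts from the state estimate and the assumed drive levels; the sketch above only illustrates the underlying run-ahead principle.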
