Abstract

Non-linear transformations of variables and the theory of implicit functions are used to derive sufficient conditions for stabilizing the unperturbed motion of a certain class of non-linear control systems. The stabilization yields Lyapunov stability, together with asymptotic stability with respect to some of the variables. Closed formulae are obtained which make it possible to organize an iterative process for determining the stabilization law that is most satisfactory (optimal) from the practical point of view. A technique is worked out by which the construction of control laws for the original non-linear system is reduced to the construction of control laws for an auxiliary linear control system of simpler type. This technique is closely related to a principle that has become popular in the modern applied theory of automatic control, namely the iterative construction of optimal control laws. As an application, the technique is used to stabilize the equilibrium position of a rigid body by means of Cardan-suspended gyroscopes and motors in which the tractive force is continuously regulated.
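
The paper's closed formulae and transformations are not reproduced in the abstract, but the general idea of reducing control design for a non-linear system to an auxiliary linear system can be illustrated with a standard linearize-and-design scheme. The sketch below is purely illustrative: the pendulum-like plant, the linear-quadratic weights, and the Riccati-based gain are assumptions of this example, not the paper's method.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical non-linear plant (not from the paper): a pendulum-like system
#   x1' = x2,   x2' = sin(x1) + u,   equilibrium at x = 0, u = 0
def f(x, u):
    return np.array([x[1], np.sin(x[0]) + u])

# Auxiliary linear system: Jacobian linearization about the unperturbed motion
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])      # d(sin x1)/dx1 = 1 at x1 = 0
B = np.array([[0.0],
              [1.0]])

# Linear-quadratic weights (an assumed, illustrative choice)
Q = np.eye(2)
R = np.array([[1.0]])

# Design the feedback u = -K x for the auxiliary linear system
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Apply that linear feedback law to the original non-linear system
def closed_loop_step(x, dt=1e-3):
    u = (-K @ x).item()
    return x + dt * f(x, u)

# Forward-Euler simulation from a perturbed initial state
x = np.array([0.5, 0.0])
for _ in range(20_000):
    x = closed_loop_step(x)
print("state after 20 s:", x)   # approaches the origin
```

In the paper itself the reduction to the auxiliary linear system is carried out through non-linear transformations of variables and the implicit function theorem rather than a simple Jacobian linearization, and the stabilization law is then refined iteratively via the closed formulae mentioned above; the sketch only conveys the overall structure of such a reduction.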
