Abstract

The theory of control of differential equations has developed in several directions, in close relation with its practical applications. Its evolution has shown that its methods and tools are drawn from a broad spectrum of mathematical branches such as ordinary differential equations, real analysis, the calculus of variations, mechanics, and geometry. The chapter presents some aspects and ideas of the classical calculus of variations that lead to the modern theory of optimal control for differential equations. Some preliminary material is presented, including elements of convex analysis and the generalized differential calculus for locally Lipschitz functionals introduced by F. H. Clarke. The exponential representation of flows is discussed in order to give a geometric formulation of the maximum principle. The Pontryagin maximum principle is treated for general Bolza problems. The dynamic programming method in optimal control, based on the partial differential equation of dynamic programming, or Bellman equation, is also presented in the chapter.
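For orientation, a standard statement of the Bolza problem and of the dynamic programming (Hamilton–Jacobi–Bellman) equation referred to above is sketched below; the notation is the conventional one and need not coincide with that of the chapter.

\[
  \min_{u(\cdot)} \; J(u) = g\bigl(x(T)\bigr) + \int_0^T L\bigl(t, x(t), u(t)\bigr)\,dt,
  \qquad \dot x(t) = f\bigl(t, x(t), u(t)\bigr), \quad x(0) = x_0, \quad u(t) \in U,
\]

with value function \(V(t,x)\) satisfying the Bellman equation

\[
  \partial_t V(t,x) + \min_{u \in U} \Bigl\{ L(t,x,u) + \nabla_x V(t,x) \cdot f(t,x,u) \Bigr\} = 0,
  \qquad V(T,x) = g(x).
\]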
