Abstract

This paper discusses optimal control for hybrid systems. Hybrid systems are defined as causal and consistent dynamical systems, and a general formulation of the optimal hybrid control problem is proposed. The main contribution is to show how necessary conditions for optimality can be derived from the maximum principle and the Bellman principle. An illustrative example shows how optimal hybrid control can be achieved via a set of Hamiltonian systems and via dynamic programming. However, as in the classical case, numerical difficulties remain and are compounded by the discontinuous nature of the problem. The search for efficient algorithms remains a difficult and open problem, which is beyond the scope of this contribution.
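To make the dynamic-programming route concrete, the following is a minimal sketch, not the paper's algorithm: a backward Bellman recursion for a discretized two-mode switched system with a cost for changing the discrete mode. All model data (the mode-dependent dynamics, the quadratic stage cost, the switching cost, and the grids) are hypothetical choices made for illustration only.

```python
import numpy as np

# Hypothetical problem data (assumed for illustration): two modes q in {0, 1}
# with scalar dynamics x_{k+1} = A[q] * x_k + B[q] * u_k.
A = [1.0, 0.8]          # mode-dependent state coefficients (assumed)
B = [0.1, 0.3]          # mode-dependent input coefficients (assumed)
SWITCH_COST = 0.5       # cost incurred when the discrete mode changes (assumed)
N = 20                  # horizon length

xs = np.linspace(-2.0, 2.0, 81)    # continuous-state grid
us = np.linspace(-1.0, 1.0, 21)    # input grid

def stage_cost(x, u):
    """Quadratic running cost (an assumption, not taken from the paper)."""
    return x**2 + 0.1 * u**2

# Value function V[k, q, i] indexed by time step, discrete mode, and state-grid index.
V = np.zeros((N + 1, 2, xs.size))
V[N] = xs**2                       # terminal cost, identical in both modes

for k in range(N - 1, -1, -1):
    for q in (0, 1):
        for i, x in enumerate(xs):
            best = np.inf
            for u in us:
                x_next = A[q] * x + B[q] * u
                j = int(np.clip(np.searchsorted(xs, x_next), 0, xs.size - 1))
                # Bellman recursion: either stay in mode q, or pay SWITCH_COST
                # to jump to the other mode before the next step.
                stay = stage_cost(x, u) + V[k + 1, q, j]
                switch = stage_cost(x, u) + SWITCH_COST + V[k + 1, 1 - q, j]
                best = min(best, stay, switch)
            V[k, q, i] = best

print("V_0(x=0, q=0) =", V[0, 0, xs.size // 2])
```

The exhaustive grid search above also illustrates the numerical difficulty mentioned in the abstract: the discrete mode multiplies the size of the search space, and the switching decision makes the value function non-smooth.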
