Abstract
This 144-page book offers a concise introduction to optimal control theory and differential games, from the minimum principle (MP) to Hamilton-Jacobi-Bellman (HJB) theory. The book is based on lecture material developed by the author for a one-semester senior-year undergraduate or first-year graduate course, which he has taught at the Swiss Federal Institute of Technology in Zurich (ETHZ). The target audience of the book is practitioners in the field rather than academics or theoreticians working in the area of optimal control. To this end, the author does an excellent job of keeping the mathematical necessities to a minimum. The only prerequisites for the reader to follow the material in the book are system dynamics, state-space representations, and differential calculus. The author concentrates on three major topics, namely, the derivation of open-loop optimal controllers using the MP, the conversion of optimal open-loop controls to optimal closed-loop controls, and the direct derivation of optimal feedback controls using HJB theory. In addition, in the final chapter the author offers a brief glimpse into the area of differential games. The style of the book is deliberately simple, sacrificing rigor for accessibility, which is appropriate for the intended readership. The book concentrates on the main ideas, leaving the details to a series of worked-out examples in each chapter. Overall, this approach works well, and it should satisfy the reader who wants to become familiar with the gist of what optimal control theory is all about.