Abstract

Bellman’s discrete dynamic programming is one of the most general approaches to solving optimal control problems. For discrete-time dynamical systems, it is, at least in theory, capable of determining globally optimal control laws. In most practical cases, both state and control variables are subject to constraints. Because numerical implementations of dynamic programming require gridding of the ranges of both state and control variables, the computational effort grows exponentially with the system dimension, a fact well known as the curse of dimensionality. Furthermore, if dynamic programming is used to design optimal controllers for systems with uncertainties, gridding of the intervals representing uncertain system parameters is inevitable. In this contribution, an interval arithmetic procedure for the design of optimal and robust controllers is presented. The procedure relies on the basic concepts of dynamic programming; sophisticated techniques for the exclusion of non-optimal control strategies significantly reduce the computational burden. Since interval techniques can be applied to both continuous-time and discrete-time dynamical systems, the interval arithmetic optimization approach presented in this chapter is applicable to both cases. In addition, the inclusion of the effects of uncertain parameters in the underlying optimality criteria is demonstrated. For that purpose, interval arithmetic routines for the analysis and design of optimal and robust controllers have been developed. Details of computationally efficient implementations of the interval arithmetic optimization procedure are summarized, together with numerical results for a mechanical positioning system with state-dependent switching between different dynamical models for viscous and Coulomb friction.
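The exclusion principle mentioned in the abstract can be illustrated with a minimal sketch (not the authors' implementation; all function names, the scalar interval representation, and the quadratic cost are illustrative assumptions): candidate control intervals yield guaranteed lower and upper bounds on the cost, and any candidate whose lower bound exceeds the smallest upper bound over all candidates provably cannot contain the optimal control and can be discarded without gridding it further.

```python
# Hypothetical sketch of interval-based exclusion of non-optimal controls
# in one dynamic-programming step. Intervals are (lo, hi) tuples; the
# quadratic stage cost x^2 + u^2 is an illustrative assumption.

def iadd(a, b):
    """Interval addition: enclosure of {x + y : x in a, y in b}."""
    return (a[0] + b[0], a[1] + b[1])

def isq(a):
    """Interval square: tight enclosure of {x**2 : x in a}."""
    lo, hi = a
    if lo >= 0.0:
        return (lo * lo, hi * hi)
    if hi <= 0.0:
        return (hi * hi, lo * lo)
    return (0.0, max(lo * lo, hi * hi))

def prune_controls(candidates, cost_of):
    """Discard provably non-optimal control intervals.

    A candidate whose cost lower bound exceeds the smallest cost
    upper bound over all candidates cannot contain the optimum.
    """
    bounds = [cost_of(u) for u in candidates]
    best_upper = min(hi for _, hi in bounds)
    return [u for u, (lo, _) in zip(candidates, bounds) if lo <= best_upper]

# Usage: uncertain state enclosure and a partition of the admissible
# control range [-1, 1] into eight subintervals.
x = (0.9, 1.1)
candidates = [(-1.0 + 0.25 * k, -1.0 + 0.25 * (k + 1)) for k in range(8)]
cost = lambda u: iadd(isq(x), isq(u))   # enclosure of x^2 + u^2
kept = prune_controls(candidates, cost)
# The two outermost subintervals are excluded; the optimum (u = 0)
# is guaranteed to lie in one of the remaining candidates.
```

The guarantee follows directly from the enclosure property of interval arithmetic: the true minimal cost lies within the bounds of some candidate, so discarding only candidates whose lower bound is worse than an achieved upper bound can never remove the optimum.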
