Abstract
Many problems in engineering can be solved by minimizing a measure of cost or maximizing a measure of performance. The designer must select a performance measure that, based on his or her understanding of the problem, captures the most important criteria and reflects their relative importance. The designer must also choose a mathematical form of the function that makes the optimization problem tractable. This chapter introduces optimal control theory for discrete-time systems. It begins with unconstrained minimization of a cost function or performance measure, then extends the solution to problems with equality constraints. Following this, it covers the optimal control of discrete-time systems. It then specializes to the linear quadratic regulator and obtains the optimality conditions for both a finite and an infinite planning horizon. In addition, the chapter addresses the regulator problem in which the system is required to track a nonzero constant signal.
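The finite-horizon linear quadratic regulator mentioned above admits a compact numerical sketch: the time-varying optimal gains follow from a backward Riccati recursion. The sketch below is illustrative only; the plant matrices, weights, and horizon are assumed example values, not taken from the chapter.

```python
import numpy as np

def lqr_finite_horizon(A, B, Q, R, Qf, N):
    """Return gains K_0..K_{N-1} minimizing
    sum_k (x_k' Q x_k + u_k' R u_k) + x_N' Qf x_N
    subject to x_{k+1} = A x_k + B u_k, with control u_k = -K_k x_k."""
    P = Qf                       # cost-to-go weight at the final time
    gains = []
    for _ in range(N):
        # Optimal gain from the discrete-time Riccati recursion
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    gains.reverse()              # the recursion runs backward in time
    return gains

# Hypothetical example: double-integrator plant with unit sampling period
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.5], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])
gains = lqr_finite_horizon(A, B, Q, R, Qf=np.eye(2), N=50)

# Simulate the closed loop from x0 = (1, 0); the state is driven to the origin
x = np.array([[1.0], [0.0]])
for K in gains:
    x = (A - B @ K) @ x
print(float(np.linalg.norm(x)))
```

As the horizon grows, the gains near the start of the horizon approach the constant infinite-horizon gain obtained from the algebraic Riccati equation.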