Abstract
The problem of uniformly optimal control is posed in the context of linear time-varying discrete-time systems; the controller is optimal uniformly over a pre-specified set of exogenous signals. Existence of an optimal controller is proved and a formula for the minimum cost is derived. The time-invariant case is treated in the frequency domain. It is shown that for time-invariant systems an optimal time-varying controller is no better than an optimal time-invariant controller.