Abstract

Three parallel gaps in robust feedback control theory are examined: sufficiency versus necessity, deterministic versus stochastic uncertainty modeling, and stability versus performance. Deterministic and stochastic output-feedback control problems are considered with both static and dynamic controllers. The static and dynamic robust stabilization problems involve deterministically modeled, bounded but unknown, measurable time-varying parameter variations, while the static and dynamic stochastic optimal control problems feature state-, control-, and measurement-dependent white noise. General sufficiency conditions for the deterministic problems are obtained using Lyapunov's direct method, while necessary conditions for the stochastic problems are derived by minimizing a quadratic performance criterion. The sufficiency tests are then applied to the necessary conditions to determine when solutions of the stochastic optimization problems also solve the deterministic robust stability problems. As an additional application of the deterministic result, the modified Riccati equation approach of Petersen and Hollot is generalized in the static case and extended to dynamic compensation.
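For orientation, a Lyapunov sufficiency test and a Petersen–Hollot-type modified Riccati equation of the kind referred to above typically take the following general form. The notation here (system matrices A and B, norm-bounded uncertainty structure D F(t) E, weights Q and R, scaling epsilon) is a representative sketch in the full-state-feedback case, not the paper's own formulation, which treats static and dynamic output feedback.

$$
\begin{aligned}
&\dot{x}(t) = \bigl(A + D\,F(t)\,E\bigr)x(t) + B\,u(t), \qquad F(t)^{\mathsf T}F(t) \le I,\\
&\text{Lyapunov sufficiency: } \exists\, P = P^{\mathsf T} > 0 \ \text{such that}\ \bigl(A_{c} + D\,F\,E\bigr)^{\mathsf T}P + P\bigl(A_{c} + D\,F\,E\bigr) < 0 \ \text{for all admissible } F,\\
&\text{modified Riccati equation, with } u = -R^{-1}B^{\mathsf T}P\,x \text{ and } A_{c} = A - BR^{-1}B^{\mathsf T}P:\\
&\qquad A^{\mathsf T}P + PA + \epsilon\,P D D^{\mathsf T}P + \epsilon^{-1}E^{\mathsf T}E + Q - P B R^{-1} B^{\mathsf T} P = 0.
\end{aligned}
$$

With $V(x) = x^{\mathsf T}Px$, the uncertainty cross term is bounded by $2x^{\mathsf T}PDFEx \le x^{\mathsf T}(\epsilon\,PDD^{\mathsf T}P + \epsilon^{-1}E^{\mathsf T}E)x$, so a positive-definite solution $P$ of the modified Riccati equation yields $\dot V \le -x^{\mathsf T}(Q + PBR^{-1}B^{\mathsf T}P)x < 0$ along closed-loop trajectories for every admissible $F(t)$, which is the sense in which the Riccati solution certifies robust stability.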
