Abstract

Optimization methods are essential and have been used extensively in a broad spectrum of applications. Most existing literature on optimization algorithms does not consider systems that involve unknown system parameters. This article studies a class of stochastic adaptive optimization problems in which identification of unknown parameters and search for the optimal solutions must be performed simultaneously. Due to a fundamental conflict between parameter identifiability and optimality in such problems, we introduce a method of adding stochastic dither signals into the system, which provide sufficient excitation for estimating the unknown parameters and lead to convergent adaptive optimization algorithms. Joint identification and optimization algorithms are developed, and the simultaneous convergence of their parameter estimates and optimization variable updates is proved. Under both noise-free and noisy observations, the corresponding convergence rates are established. The main results of this article reveal fundamental relationships and tradeoffs among updating step sizes, dither magnitudes, parameter estimation errors, optimization accuracy, and convergence rates. Simulation case studies illustrate the adaptive optimization algorithms and their main properties.
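To make the identifiability-versus-optimality conflict concrete, the following is a minimal sketch (not the paper's algorithm) of joint identification and optimization on a hypothetical quadratic cost y = θ₁x² + θ₂x with unknown θ. Without dither, the regressor becomes constant once x settles near the optimum, so θ is no longer identifiable; the stochastic dither d restores excitation. All parameter values and the recursive least-squares estimator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# True unknown parameters of the cost y = th1*x^2 + th2*x (hypothetical example);
# the minimizer is x* = -th2 / (2*th1) = 1.0
th_true = np.array([1.0, -2.0])

theta_hat = np.array([0.5, 0.0])   # initial parameter estimate
P = 100.0 * np.eye(2)              # RLS covariance matrix
x = -1.0                           # initial optimization variable
gamma = 0.05                       # optimization step size

for k in range(2000):
    d = 0.1 * rng.standard_normal()   # stochastic dither for excitation
    xk = x + d                         # dithered probe point
    phi = np.array([xk**2, xk])        # regressor (the model is linear in theta)
    y = phi @ th_true + 0.01 * rng.standard_normal()  # noisy observation

    # Recursive least-squares update of the parameter estimate
    K = P @ phi / (1.0 + phi @ P @ phi)
    theta_hat = theta_hat + K * (y - phi @ theta_hat)
    P = P - np.outer(K, phi @ P)

    # Gradient step on the estimated cost: d/dx (th1*x^2 + th2*x) = 2*th1*x + th2
    x = x - gamma * (2.0 * theta_hat[0] * x + theta_hat[1])
```

After the loop, x is close to the true minimizer 1.0 and theta_hat is close to th_true. The dither magnitude illustrates the tradeoff highlighted in the abstract: larger dither speeds parameter estimation but perturbs the iterates away from the optimum.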
