Abstract
In this paper, a special recurrent neural network (RNN), i.e., the Zhang neural network (ZNN), is presented and investigated for online time-varying nonlinear optimization (OTVNO). Compared with previous work by others, this paper analyzes continuous-time and discrete-time ZNN models theoretically via rigorous proofs. Theoretical results show that the residual errors of the continuous-time ZNN model possess a global exponential convergence property, and that the maximal steady-state residual errors of any method designed intrinsically for solving the static optimization problem and employed for the online solution of OTVNO are O(τ), where τ denotes the sampling gap. In the presence of noises, the residual errors of the continuous-time ZNN model can be made arbitrarily small for both constant noises and random noises. Moreover, an optimal sampling-gap formula is proposed for the discrete-time ZNN model in noisy environments. Finally, computer-simulation results further substantiate the performance analyses of the ZNN models exploited for online time-varying nonlinear optimization.
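To illustrate the idea behind a discrete-time ZNN model, the following minimal sketch tracks the time-varying minimizer of a hypothetical scalar cost f(x, t) = (x − sin t)². The ZNN design principle forces the gradient error e(t) = ∂f/∂x to decay as ė = −γe, which here yields ẋ = −γ(x − sin t) + cos t; an Euler discretization with sampling gap τ gives the discrete-time update. The cost function, gain γ, and gap τ below are illustrative assumptions, not the paper's specific examples.

```python
import math

def znn_discrete(gamma=10.0, tau=0.001, steps=5000):
    """Discrete-time ZNN sketch for the illustrative cost
    f(x, t) = (x - sin t)^2, whose time-varying minimizer is sin t.

    ZNN design: impose de/dt = -gamma * e on the gradient error
    e = 2*(x - sin t), which gives the continuous-time dynamics
        x_dot = -gamma * (x - sin t) + cos t,
    then apply a forward-Euler step with sampling gap tau.
    """
    x = 0.5  # arbitrary initial guess away from the minimizer sin(0) = 0
    for k in range(steps):
        t = k * tau
        x_dot = -gamma * (x - math.sin(t)) + math.cos(t)
        x += tau * x_dot  # Euler step: x_{k+1} = x_k + tau * x_dot_k
    t_final = steps * tau
    residual = abs(x - math.sin(t_final))  # steady-state tracking error
    return x, residual
```

In this toy setting the steady-state residual scales with the sampling gap τ, consistent with the O(τ) bound stated in the abstract: halving τ roughly halves the tracking error.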