Abstract

It is well known that disturbances can cause divergence of neural networks in the identification of nonlinear systems. Sufficient conditions using so‐called modified algorithms are available to guarantee convergence for adaptive systems: the dead zone scheme, adaptive law modification, and σ‐modification. These schemes normally require knowledge of the upper bound of the disturbance. In this paper, a robust weight‐tuning algorithm with an adaptive dead zone scheme is used to train a multi‐layered neural network. The proposed robust adaptive algorithm requires knowledge of neither the upper bound of the disturbance nor the bound on the norm of the parameter estimates. A complete convergence proof based on the Lyapunov theorem is provided for the nonlinear system. Simulation results are presented to show the good performance of the algorithm.
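The abstract does not spell out the update law, but the idea of an adaptive dead zone can be illustrated with a minimal sketch: weight updates are applied only when the identification error exceeds a dead‐zone width that is itself adapted online, so no a priori disturbance bound is needed. The network structure, learning rate `eta`, adaptation gain `gamma`, and the plant `f_true` below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Sketch (assumed, not the paper's exact law): identify a nonlinear plant
# y(k+1) = f(y(k), u(k)) + d(k) with a one-hidden-layer network, gating the
# gradient weight update by an adaptive dead zone on the identification error.

rng = np.random.default_rng(0)

n_in, n_hidden = 2, 10
W1 = 0.1 * rng.standard_normal((n_hidden, n_in))   # input-to-hidden weights
W2 = 0.1 * rng.standard_normal(n_hidden)           # hidden-to-output weights

eta = 0.05      # learning rate (assumed)
delta = 0.0     # adaptive dead-zone width, adapted online (assumed initial value)
gamma = 0.01    # dead-zone adaptation gain (assumed)

def f_true(y, u):
    """Unknown plant, used here only to generate data for the sketch."""
    return 0.6 * np.sin(y) + 0.3 * u

y = 0.0
for k in range(2000):
    u = np.sin(0.05 * k)                      # excitation input
    d = 0.05 * rng.standard_normal()          # bounded disturbance, bound unknown
    y_next = f_true(y, u) + d

    # Network prediction and identification error
    x = np.array([y, u])
    h = np.tanh(W1 @ x)
    y_hat = W2 @ h
    e = y_next - y_hat

    if abs(e) > delta:
        # Robust weight tuning: update only outside the dead zone
        W2 += eta * e * h
        W1 += eta * e * np.outer(W2 * (1.0 - h**2), x)
        # Adapt the dead-zone width toward the observed error level
        delta += gamma * (abs(e) - delta)
    # Inside the dead zone the weights are frozen, so the bounded
    # disturbance cannot drive the parameter estimates to drift.

    y = y_next
```

The design choice this sketch reflects is the one named in the abstract: because the dead‐zone width is adapted rather than fixed from a known disturbance bound, the scheme avoids the usual requirement of knowing that bound in advance.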
