Abstract

This paper studies the asymptotic properties of neural networks used for the adaptive identification of nonlinearly parameterized systems. To update the neural network's parameters, a simple online gradient-type learning algorithm is employed. A distinguishing feature of this algorithm is that its step size remains constant in both the non-stochastic and stochastic cases, and the learning set is infinite. Based on a Lyapunov-like concept, sufficient conditions guaranteeing the convergence of the learning algorithm are derived. Simulations are presented to support the theoretical results.
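To make the setting concrete, the following is a minimal sketch of an online gradient-type update with a constant step size, of the kind the abstract describes. The model f, its gradient, and all parameter values are hypothetical stand-ins; the paper's actual network structure, error criterion, and convergence conditions are not reproduced here.

```python
import numpy as np

def f(x, theta):
    # Hypothetical nonlinearly parameterized model: a single sigmoid node.
    return 1.0 / (1.0 + np.exp(-theta[0] * x + theta[1]))

def grad_f(x, theta):
    # Gradient of f with respect to theta (chain rule for the sigmoid above).
    s = f(x, theta)
    return np.array([x * s * (1.0 - s), -s * (1.0 - s)])

def online_update(theta, x, y, eta):
    # One gradient step on the instantaneous squared error e = f(x, theta) - y.
    # eta is held constant for every sample, as in the algorithm studied here.
    e = f(x, theta) - y
    return theta - eta * e * grad_f(x, theta)

# Illustration: identify a "true" parameter vector from a stream of samples.
# An infinite learning set is approximated by drawing fresh inputs forever;
# the additive noise term makes this the stochastic case.
rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])
theta = np.zeros(2)
eta = 0.1  # constant step size (hypothetical value)

for _ in range(20000):
    x = rng.uniform(-3.0, 3.0)
    y = f(x, theta_true) + 0.01 * rng.standard_normal()
    theta = online_update(theta, x, y, eta)

print(theta)  # approaches theta_true when the convergence conditions hold
```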
