Abstract

In contrast to gradient-based neural networks, a special kind of recurrent neural network has recently been proposed by Zhang *et al.* for online matrix inversion. Such a neural network is designed on the basis of a matrix-valued error function instead of a scalar-valued norm-based error function. In this paper, we develop and investigate a discrete-time model of the Zhang neural network (abbreviated to ZNN for presentation convenience), which is described by a system of difference equations. Comparing it with Newton iteration for matrix inversion, we find that the discrete-time ZNN model incorporates Newton iteration as a special case. In light of this relation, we perform numerical comparisons between the Zhang neural network and Newton iteration for matrix inversion under various conditions. Different activation functions and step-size values are also examined for the superior convergence and better stability of the ZNN model. Numerical examples demonstrate the effectiveness of both the ZNN model and Newton iteration for constant matrix inversion.
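To make the relation between the two methods concrete, the following is a minimal sketch of a discrete-time ZNN-style update for inverting a constant matrix A, written as X_{k+1} = X_k − h·X_k·F(A·X_k − I), where h is the step size and F is the activation function applied entrywise. The function name, tolerance, and initial guess below are illustrative assumptions, not the paper's exact formulation; note that with h = 1 and the identity activation the update reduces to the classical Newton iteration X_{k+1} = X_k(2I − A·X_k).

```python
import numpy as np

def discrete_znn_inverse(A, h=1.0, F=lambda E: E, tol=1e-12, max_iter=200):
    """Illustrative discrete-time ZNN-style iteration for inverting A.

    Update: X_{k+1} = X_k - h * X_k * F(A X_k - I).
    With h = 1 and identity activation F, this is exactly Newton iteration
    for matrix inversion: X_{k+1} = X_k (2I - A X_k).
    """
    n = A.shape[0]
    I = np.eye(n)
    # A standard convergent initial guess for Newton-type matrix inversion:
    # X_0 = A^T / (||A||_1 * ||A||_inf).
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    for _ in range(max_iter):
        X = X - h * X @ F(A @ X - I)
        # Stop once the matrix-valued residual A X - I is small enough.
        if np.linalg.norm(A @ X - I) < tol:
            break
    return X

# Usage: invert a small constant matrix with the Newton special case (h=1, F=identity).
A = np.array([[4.0, 1.0], [2.0, 3.0]])
X = discrete_znn_inverse(A)
print(np.allclose(X, np.linalg.inv(A)))
```

Choosing a different activation F (e.g. a power or sigmoid function, as examined in the paper) or a step size h other than 1 yields the more general ZNN iterations whose convergence and stability the paper compares against Newton iteration.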
