Abstract

Wang proposed a gradient-based neural network (GNN) for online matrix inversion. Global asymptotic convergence was shown for this neural network when it is applied to inverting nonsingular matrices. Going beyond the previously presented asymptotic convergence, this paper investigates more desirable properties of the gradient-based neural network, namely, global exponential convergence for nonsingular matrix inversion and global stability even in the singular-matrix case. Illustrative simulation results further substantiate the theoretical analysis of the gradient-based neural network for online matrix inversion.
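As a rough illustration of the kind of dynamics involved, the sketch below simulates a gradient-based neural network for matrix inversion. It assumes the commonly used GNN model dX/dt = -γ·Aᵀ(AX − I), which descends the energy function E(X) = ½‖AX − I‖²_F; the function name, parameters, and Euler discretization are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def gnn_invert(A, gamma=10.0, dt=1e-3, steps=20000):
    """Hypothetical sketch: integrate the assumed GNN dynamics
    dX/dt = -gamma * A.T @ (A @ X - I) with forward-Euler steps."""
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))  # arbitrary initial state
    for _ in range(steps):
        # Gradient of E(X) = 0.5 * ||A X - I||_F^2 is A.T @ (A @ X - I)
        X -= dt * gamma * A.T @ (A @ X - I)
    return X

# Example with a nonsingular matrix: the state should converge to inv(A)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = gnn_invert(A)
print(np.allclose(X, np.linalg.inv(A), atol=1e-4))  # → True
```

For a nonsingular A, the residual decays exponentially (consistent with the exponential-convergence result discussed in the paper), with the slowest rate governed by the smallest eigenvalue of AᵀA; for a singular A the iteration stays bounded rather than diverging.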
