Abstract

Our first aim is to extend the Zhang Neural Network (ZNN) algorithmic and conceptual framework, developed so far for computing the matrix inverse, the pseudoinverse, and the Drazin inverse, to the class of outer inverses with prescribed range and null space. For this purpose, two ZNN models with two types of complex activation functions, designed for computing outer inverses of time-varying complex matrices, are presented. In addition, an appropriate finite-time ZNN model based on the Li activation function is introduced. The design of these neural networks rests on appropriate Zhang functions (ZFs) arising from known limiting representations of outer inverses. The convergence properties of the introduced ZNN models are investigated, and simulation results are presented to demonstrate their usability, generality, and effectiveness. Previously known results, corresponding to the Moore-Penrose inverse, the Drazin inverse, and the group inverse, follow as particular cases of the models defined in the present paper. Restrictions on the spectrum are avoided, as are requirements on the nonsingularity of certain matrices. Moreover, a hybrid combination of the ZNN and GNN models for computing outer inverses with global convergence is defined for constant real matrices.
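The abstract does not reproduce the model equations, but the ZNN design framework it builds on is standard: choose a matrix-valued Zhang (error) function E(t) that vanishes exactly at the desired inverse, impose the design formula dE(t)/dt = -gamma * F(E(t)) with an activation function F applied entrywise, and solve the resulting implicit dynamics for the state. The sketch below is a minimal illustration of that design formula in its simplest real instance, time-varying matrix inversion with the Zhang function E(t) = A(t)X(t) - I, rather than the paper's outer-inverse or complex-valued models; the example matrix A_fun, the gain gamma, the exponent r, and the forward-Euler discretization are illustrative assumptions, not taken from the paper. The Li (sign-bi-power) activation shown is the one commonly associated with finite-time ZNN convergence.

```python
import numpy as np

# --- Activation functions applied entrywise to the Zhang (error) function E(t) ---

def linear_act(E):
    """Linear activation F(E) = E (exponential decay of the ZNN error)."""
    return E

def power_sigmoid_act(E, p=3, xi=4.0):
    """Power-sigmoid activation: sigmoid branch for |e| < 1, odd power e**p otherwise.
    The parameters p and xi are illustrative choices, not taken from the paper."""
    sig = ((1 + np.exp(-xi)) / (1 - np.exp(-xi))
           * (1 - np.exp(-xi * E)) / (1 + np.exp(-xi * E)))
    return np.where(np.abs(E) < 1.0, sig, E ** p)

def li_act(E, r=0.5):
    """Li (sign-bi-power) activation, associated with finite-time convergence:
    F(e) = 0.5 * (|e|**r + |e|**(1/r)) * sign(e), with 0 < r < 1."""
    return 0.5 * (np.abs(E) ** r + np.abs(E) ** (1.0 / r)) * np.sign(E)

def znn_inverse(A_fun, dA_fun, X0, gamma=10.0, act=linear_act, t_end=2.0, dt=1e-3):
    """Forward-Euler integration of the implicit ZNN dynamics obtained from the
    Zhang function E(t) = A(t) X(t) - I, i.e.
        A(t) X'(t) = -A'(t) X(t) - gamma * F(A(t) X(t) - I).
    A_fun(t) must return a nonsingular square matrix; dA_fun(t) its derivative."""
    X = X0.astype(float).copy()
    I = np.eye(X.shape[0])
    for k in range(int(round(t_end / dt))):
        t = k * dt
        A, dA = A_fun(t), dA_fun(t)
        E = A @ X - I                                    # Zhang function
        X_dot = np.linalg.solve(A, -dA @ X - gamma * act(E))
        X += dt * X_dot                                  # explicit Euler step
    return X

if __name__ == "__main__":
    # Illustrative time-varying matrix, diagonally dominant (hence nonsingular) for all t.
    A_fun = lambda t: np.array([[2 + np.sin(t), 0.5], [0.5, 2 + np.cos(t)]])
    dA_fun = lambda t: np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
    X = znn_inverse(A_fun, dA_fun, X0=np.eye(2), act=li_act)
    # The residual ||A(t_end) X(t_end) - I|| should be close to zero.
    print(np.linalg.norm(A_fun(2.0) @ X - np.eye(2)))
```

In the paper's setting, the abstract indicates that the same design formula is applied to Zhang functions arising from known limiting representations of outer inverses with prescribed range and null space; relative to the sketch above, only the error function, and hence the linear system solved at each integration step, would change.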
