Abstract
In this paper, we develop efficient methods for computing the Takagi components and Takagi subspaces of complex symmetric matrices via complex-valued neural network models. First, we present a unified self-stabilizing neural network learning algorithm for principal Takagi components and study the stability of the proposed algorithm via fixed-point analysis. Second, the unified algorithm for extracting principal Takagi components is generalized to compute the principal Takagi subspace. Third, we prove that the associated differential equations globally asymptotically converge to an invariant set, and that the corresponding energy function attains its unique global minimum if and only if its state matrices span the principal Takagi subspace. Finally, numerical simulations are carried out to illustrate the theoretical results.
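For background, the Takagi decomposition the abstract refers to factors a complex symmetric matrix $A = A^T$ as $A = Q \Sigma Q^T$, where $Q$ is unitary and $\Sigma$ is the nonnegative diagonal of singular values; the columns of $Q$ are the Takagi components. The sketch below is not the paper's neural-network method, but a standard SVD-based construction of the decomposition (assuming distinct singular values) that can serve as a reference when checking such algorithms:

```python
import numpy as np

def takagi(A):
    """Takagi decomposition of a complex symmetric matrix A = A^T.

    Returns (Q, s) with Q unitary and s >= 0 such that
    A = Q @ diag(s) @ Q.T. Assumes distinct singular values.
    """
    # Standard SVD: A = U diag(s) V^H.
    U, s, Vh = np.linalg.svd(A)
    V = Vh.conj().T
    # Symmetry of A forces conj(V) = U @ Phi with Phi a diagonal
    # unitary matrix (when singular values are distinct).
    Phi = U.conj().T @ V.conj()
    # Take the principal square root of each diagonal phase;
    # then Q = U Phi^{1/2} satisfies A = Q diag(s) Q^T.
    half_phase = np.exp(0.5j * np.angle(np.diag(Phi)))
    Q = U * half_phase  # scale each column of U by its phase factor
    return Q, s

# Usage: build a random complex symmetric test matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.T  # complex symmetric (not Hermitian)
Q, s = takagi(A)
```

A reasonable check is that `Q @ np.diag(s) @ Q.T` reconstructs `A` and that `Q` is unitary; the principal Takagi subspace the paper computes is the span of the columns of `Q` associated with the largest entries of `s`.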