Abstract
Using the hierarchical identification principle, built on conventional gradient search, two neural subsystems are developed and investigated for the online solution of the well-known Lyapunov matrix equation. Theoretical analysis shows that, with any monotonically increasing odd activation function, the gradient-based neural network (GNN) models can solve the Lyapunov equation exactly and efficiently. Computer simulation results confirm that the states of the presented GNN models globally converge to the solution of the Lyapunov matrix equation. Moreover, when power-sigmoid activation functions are used, the GNN models converge faster than their linearly activated counterparts.
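As a minimal illustrative sketch (not the paper's exact formulation), the GNN approach can be read as gradient descent on the squared Frobenius norm of the Lyapunov residual R = AᵀX + XA + C, with a monotonically increasing odd activation F applied elementwise to R. The function names, step size `gamma`, iteration count, and the specific power-sigmoid parameters below are illustrative assumptions; a forward-Euler discretization stands in for the continuous-time neural dynamics.

```python
import numpy as np

def power_sigmoid(e, p=3, xi=4.0):
    """One common construction of a monotonically increasing odd
    activation (assumed parameters): power rule for |e| >= 1,
    scaled sigmoid otherwise, applied elementwise."""
    lin = ((1 + np.exp(-xi)) / (1 - np.exp(-xi))
           * (1 - np.exp(-xi * e)) / (1 + np.exp(-xi * e)))
    return np.where(np.abs(e) >= 1, e ** p, lin)

def gnn_lyapunov(A, C, gamma=0.01, steps=2000, activation=power_sigmoid):
    """Euler-discretized GNN flow  X' = -gamma * (A F(R) + F(R) A^T),
    where R = A^T X + X A + C is the Lyapunov-equation residual and
    A F(R) + F(R) A^T is the gradient direction of ||R||_F^2 / 2."""
    X = np.zeros_like(C)
    for _ in range(steps):
        R = A.T @ X + X @ A + C          # current residual
        F = activation(R)                 # elementwise odd activation
        X = X - gamma * (A @ F + F @ A.T) # gradient-descent update
    return X

if __name__ == "__main__":
    # Stable A guarantees a unique solution for symmetric C.
    A = np.array([[-3.0, 1.0], [0.0, -2.0]])
    C = np.eye(2)
    X = gnn_lyapunov(A, C)
    print("residual norm:", np.linalg.norm(A.T @ X + X @ A + C))
```

With the linear activation `lambda e: e` the same loop recovers the plain gradient model; the power-sigmoid variant accelerates convergence when residual entries are large (power rule) while retaining a healthy gain near the solution (sigmoid segment), which is the qualitative behavior the abstract attributes to power-sigmoid activation.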