Abstract

<p style='text-indent:20px;'>We study the use of single hidden layer neural networks for the approximation of Lyapunov functions in autonomous ordinary differential equations. In particular, we focus on the connection between this approach and the meshless collocation method using reproducing kernel Hilbert spaces (RKHS). It is shown that, under certain conditions, an optimised neural network is functionally equivalent to the RKHS generalised interpolant solution corresponding to a kernel function that is implicitly defined by the neural network. We demonstrate convergence of the neural network approximation using several numerical examples, and compare it with approximations obtained by the meshless collocation method. Finally, motivated by our theoretical and numerical findings, we propose a new iterative algorithm for the approximation of Lyapunov functions using single hidden layer neural networks.</p>
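The idea summarised above can be illustrated with a minimal, hypothetical sketch (not the paper's method): a single hidden layer network V(x) is fitted at collocation points so that its orbital derivative along a toy linear system x' = -x matches the target -||x||², a standard Lyapunov-type condition. The network size, dynamics, target, and the finite-difference training loop are all illustrative assumptions.

```python
import numpy as np

# Toy setup (assumption): 2-D linear dynamics x' = -x, which is globally
# asymptotically stable, so V(x) = ||x||^2 is a known Lyapunov function.
rng = np.random.default_rng(0)
H = 8                                  # hidden units (arbitrary choice)
A0 = rng.normal(size=(H, 2))           # input weights
b0 = rng.normal(size=H)                # biases
w0 = rng.normal(size=H)                # output weights

X = rng.uniform(-1.0, 1.0, size=(200, 2))   # collocation points

def f(x):
    """Toy dynamics x' = -x."""
    return -x

def loss(params):
    """Mean squared mismatch between the orbital derivative of
    V(x) = sum_k w_k tanh(A_k . x + b_k) along f and the target -||x||^2."""
    A = params[:2 * H].reshape(H, 2)
    b = params[2 * H:3 * H]
    w = params[3 * H:]
    z = X @ A.T + b                    # (N, H) pre-activations
    s = 1.0 - np.tanh(z) ** 2          # tanh' at each pre-activation
    grad_V = (s * w) @ A               # (N, 2): gradient of V at each point
    orb = np.sum(grad_V * f(X), axis=1)          # orbital derivative V'(x)
    target = -np.sum(X ** 2, axis=1)             # want V'(x) = -||x||^2
    return np.mean((orb - target) ** 2)

# Finite-difference gradient descent with backtracking (illustrative only;
# any standard optimiser would do for a network this small).
p = np.concatenate([A0.ravel(), b0, w0])
eps, lr = 1e-5, 0.05
l0 = loss(p)
for _ in range(200):
    g = np.array([(loss(p + eps * e) - loss(p - eps * e)) / (2 * eps)
                  for e in np.eye(p.size)])
    cand = p - lr * g
    if loss(cand) < loss(p):
        p = cand                       # accept the descent step
    else:
        lr *= 0.5                      # backtrack on overshoot
print("initial loss:", l0, "final loss:", loss(p))
```

In the RKHS view the paper refers to, the same collocation conditions on the orbital derivative would instead be imposed exactly via a generalised interpolant; the sketch above only mimics that constraint with a least-squares penalty.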
