Abstract

Extreme Learning Machine (ELM) is a promising learning scheme for nonlinear classification and regression problems and has shown its effectiveness in the machine learning literature. ELM represents a class of generalized single hidden layer feed-forward networks (SLFNs) whose hidden layer parameters are assigned randomly, resulting in an extremely fast learning speed along with good generalization performance. It is well known that the online sequential learning algorithm (OS-ELM) based on recursive least squares [1] might result in ill-conditioning of the Hessian matrix and hence instability in the parameter estimation. To address this issue, Lyapunov stability theory is utilized to develop an online learning algorithm for temporal data from dynamic systems and time series. The developed algorithm yields parameter estimation that is asymptotically stable, leading to boundedness of the model states. Simulation results comparing the developed algorithm against online sequential ELM (OS-ELM) and offline batch-learning ELM (O-ELM) show that the Lyapunov ELM algorithm performs online learning with reduced computation, comparable accuracy, and a guarantee on the boundedness of the estimated system.
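To make the ELM scheme described above concrete, the following is a minimal sketch of the basic batch ELM for regression: hidden-layer weights and biases are drawn randomly and never trained, and only the output weights are obtained by a single least-squares solve. This illustrates the generic SLFN idea only, not the paper's Lyapunov-based online algorithm; the toy data, activation choice, and hidden-layer size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative assumption): y = sin(x) plus noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.05 * rng.normal(size=200)

# Hidden layer: parameters assigned randomly, never updated by training
n_hidden = 50
W = rng.normal(size=(1, n_hidden))   # random input weights
b = rng.normal(size=n_hidden)        # random biases
H = np.tanh(X @ W + b)               # hidden-layer output matrix

# Output weights: the only learned quantity, via one least-squares solve
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

# Predict with the fitted model
y_hat = np.tanh(X @ W + b) @ beta
mse = float(np.mean((y - y_hat) ** 2))
```

The single linear solve for `beta` is what gives ELM its fast training; OS-ELM replaces it with a recursive least-squares update, which is where the ill-conditioning issue discussed in the abstract arises.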
