Abstract

In online sequential applications, a machine learning model needs a self-updating ability to handle situations in which the training set changes. Conventional incremental extreme learning machine (ELM) and online sequential ELM are usually realized in one of two ways: directly updating the output weights, or recursively computing the left pseudo-inverse of the hidden layer output matrix. In this paper, we develop a novel solution for incremental and decremental ELM (DELM) by recursively updating and downdating the generalized inverse of the hidden layer output matrix. While preserving global optimality and the best generalization performance, our approach implements node-incremental ELM (N-IELM) and sample-incremental ELM (S-IELM) in a unified form, and overcomes the self-starting and numerical-instability problems of conventional online sequential ELM. We also propose sample DELM (S-DELM), which is the first decremental version of ELM. Experiments on regression and classification problems with real-world data sets demonstrate the feasibility and effectiveness of the proposed algorithms, with encouraging performance.
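To make the update/downdate idea concrete, the sketch below illustrates the conventional sample-incremental route the abstract contrasts against: a recursive-least-squares (OS-ELM style) update of (H^T H)^{-1} via the Sherman-Morrison identity, together with the matching downdate for removing a sample. This is a minimal illustration under assumed choices (sigmoid activation, random data, the helper name `hidden_output` and all dimensions are hypothetical), not the paper's generalized-inverse algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden_output(X, W, b):
    """Sigmoid hidden-layer output matrix H for inputs X (N x d)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Batch ELM: random input weights, least-squares output weights.
d, L, m = 5, 20, 1                      # input dim, hidden nodes, outputs
W = rng.standard_normal((d, L))         # fixed random input weights
b = rng.standard_normal(L)              # fixed random biases

X0 = rng.standard_normal((50, d))
T0 = rng.standard_normal((50, m))
H0 = hidden_output(X0, W, b)
beta = np.linalg.pinv(H0) @ T0          # global least-squares optimum

# Sample-incremental update: track P = (H^T H)^{-1} and apply the
# Sherman-Morrison identity when a new sample (x, t) arrives.
P = np.linalg.inv(H0.T @ H0)            # needs an over-determined start
x = rng.standard_normal((1, d))
t = rng.standard_normal((1, m))
h = hidden_output(x, W, b)              # 1 x L hidden row of the new sample
P -= (P @ h.T @ h @ P) / (1.0 + h @ P @ h.T)
beta += P @ h.T @ (t - h @ beta)        # correct beta with the residual

# Sample-decremental downdate: the same identity with flipped signs
# removes a stored sample (h, t); the denominator stays positive as
# long as removal keeps H full column rank.
P += (P @ h.T @ h @ P) / (1.0 - h @ P @ h.T)
beta -= P @ h.T @ (t - h @ beta)
```

Note how this conventional route only self-starts once (H^T H) is invertible, i.e., after enough initial samples have arrived; the abstract's approach of updating and downdating the generalized inverse of H itself is what removes that self-starting requirement.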
