Abstract

This paper investigates noise/fault tolerant incremental algorithms for the extreme learning machine (ELM) concept. Existing incremental ELM algorithms can be classified into two approaches: non-recomputation and recomputation. This paper first formulates a noise/fault aware objective function for nonlinear regression problems. Instead of developing noise/fault aware algorithms for the two computational approaches one by one, this paper uses two representative incremental algorithms, namely the incremental ELM (I-ELM) and the error-minimized ELM (EM-ELM), to develop two noise/fault aware incremental algorithms, called the generalized I-ELM (GI-ELM) and the generalized EM-ELM (GEM-ELM). The GI-ELM adds k hidden nodes to the existing network at each incremental step without recomputing the existing weights. For a fair comparison, a modified version of I-ELM is used as a baseline algorithm; simulations demonstrate that the noise/fault tolerance of the proposed GI-ELM is better than that of the modified I-ELM. The GEM-ELM also adds k hidden nodes to the existing network at each incremental step, but all output weights are recomputed based on a recursive formula. With a modified version of EM-ELM as the baseline, simulations demonstrate that the noise/fault tolerance of the proposed GEM-ELM is better than that of the modified EM-ELM. Moreover, the multiple-set concept can further enhance the performance of the two proposed algorithms. Following our research results, other non-noise/fault tolerant incremental algorithms can be made noise/fault tolerant.
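For readers unfamiliar with the non-recomputation approach that the GI-ELM builds on, the following is a minimal NumPy sketch of a classical I-ELM step: a random hidden node is added and only its output weight is fitted to the current residual, while all existing weights stay fixed. The sigmoid activation, data, and node count are illustrative assumptions; the sketch does not reproduce the noise/fault aware objective or the k-nodes-per-step extension of the proposed GI-ELM.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def i_elm(X, y, max_nodes=100, rng=None):
    """Classical I-ELM (non-recomputation): add one random hidden node per step.

    Illustrative sketch only; the proposed GI-ELM additionally uses a
    noise/fault aware objective and adds k nodes per incremental step.
    """
    rng = rng or np.random.default_rng(0)
    n_samples, n_features = X.shape
    residual = y.copy()                 # e_0 = target; shrinks as nodes are added
    nodes = []                          # (input weights a, bias b, output weight beta)
    for _ in range(max_nodes):
        a = rng.standard_normal(n_features)   # random input weights (never retrained)
        b = rng.standard_normal()             # random bias
        h = sigmoid(X @ a + b)                # hidden-node output on all samples
        beta = (residual @ h) / (h @ h)       # least-squares weight for this node only
        nodes.append((a, b, beta))
        residual = residual - beta * h        # existing weights are never recomputed
    return nodes

def predict(nodes, X):
    return sum(beta * sigmoid(X @ a + b) for a, b, beta in nodes)

# Usage on a hypothetical noisy 1-D regression problem.
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sinc(2 * X[:, 0]) + 0.05 * np.random.default_rng(1).standard_normal(200)
model = i_elm(X, y, max_nodes=100)
print("training RMSE:", np.sqrt(np.mean((predict(model, X) - y) ** 2)))
```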
