Abstract

The incremental extreme learning machine (I-ELM) algorithm provides a training mechanism with low computational complexity for single-hidden-layer feedforward networks (SLFNs). However, the original I-ELM algorithm does not account for node noise, which may greatly degrade the performance of a trained SLFN. This paper presents a generalized node-noise-resistant I-ELM (GNNR-I-ELM) for SLFNs. We first define a noise-resistant training objective function for SLFNs. We then develop the GNNR-I-ELM algorithm, which adds \(\tau \) nodes to the network at each iteration. The GNNR-I-ELM algorithm estimates the output weights of the newly added nodes and leaves all previously trained output weights unchanged. Its noise tolerance is much better than that of the original I-ELM. In addition, we prove that the GNNR-I-ELM algorithm converges in terms of the training-set mean squared error of noisy networks.
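The incremental construction described above can be illustrated with a minimal sketch of standard I-ELM in Python. This is an assumption-laden toy (the data, node count, and sigmoid activation are illustrative choices, and the plain least-squares objective below does not include the paper's noise-resistant term); it only shows the mechanism of adding \(\tau \) random hidden nodes per iteration while leaving earlier output weights untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative; not from the paper)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

tau = 5           # nodes added per iteration (the paper's tau)
max_nodes = 50
residual = y.copy()           # current training residual
weights, biases, betas = [], [], []

while len(betas) < max_nodes:
    for _ in range(tau):
        # Draw a random hidden node (input weight a, bias b), as in I-ELM
        a = rng.normal(size=X.shape[1])
        b = rng.normal()
        h = sigmoid(X @ a + b)          # hidden-node outputs on the data
        # Closed-form output weight for this node alone;
        # previously trained output weights are left unchanged
        beta = (residual @ h) / (h @ h)
        residual -= beta * h            # shrink the residual error
        weights.append(a); biases.append(b); betas.append(beta)

mse = float(np.mean(residual ** 2))
print(f"nodes={len(betas)}, training MSE={mse:.4f}")
```

Because each \(\beta\) is the least-squares fit of the current residual onto the new node's output, the training-set residual norm is non-increasing as nodes are added, which is the intuition behind the convergence result stated above.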
