Abstract

Ensemble approaches in the Extreme Learning Machine literature mainly rely on data sampling procedures, under the assumption that the training data are heterogeneous enough to set up diverse base learners. To overcome this assumption, an ELM ensemble method based on the Negative Correlation Learning framework, called Negative Correlation Extreme Learning Machine (NCELM), was proposed. This model works in three stages: (i) different ELMs are generated as base learners with random weights in the hidden layer; (ii) an NCL penalty term incorporating the ensemble prediction is introduced into each ELM minimization problem, updating the base learners; and (iii) the second stage is iterated until the ensemble converges. Although this NCL ensemble method was validated in an experimental study with multiple benchmark datasets, no conditions for this convergence were given. This paper mathematically presents sufficient conditions that guarantee the global convergence of NCELM. The update of the ensemble in each iteration is shown to be a contraction mapping, and global convergence of the ensemble is proved via the Banach fixed-point theorem.
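As a rough illustration of the scheme the abstract describes, the Python sketch below builds an ensemble of random-feature ELMs, iterates a simplified NCL-penalised update, and stops when successive ensemble predictions stop moving, which is the empirical signature of the update map reaching a fixed point (for a contraction mapping with Lipschitz constant below one, the Banach fixed-point theorem guarantees such convergence). The objective, its closed-form solve, and all parameter values (ensemble size S, hidden width D, ridge weight C, diversity weight lam) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden(X, W, b):
    """Random-feature hidden layer with a sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Toy regression data (hypothetical stand-in for a benchmark dataset).
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.sin(X).sum(axis=1) + 0.1 * rng.standard_normal(200)

S, D = 5, 50          # ensemble size and hidden-layer width (assumed values)
C, lam = 1e-2, 0.3    # ridge penalty and NCL diversity weight (assumed values)

# Stage (i): base ELMs with random, fixed hidden-layer weights.
Hs = [hidden(X, rng.standard_normal((X.shape[1], D)), rng.standard_normal(D))
      for _ in range(S)]
betas = [np.linalg.solve(H.T @ H + C * np.eye(D), H.T @ y) for H in Hs]

# Stages (ii)-(iii): iterate the NCL-penalised update until the ensemble
# prediction stops moving, i.e. the update map reaches its fixed point.
f_bar = np.mean([H @ b for H, b in zip(Hs, betas)], axis=0)
for it in range(100):
    for s, H in enumerate(Hs):
        # Closed-form minimiser of a *simplified* NCL objective
        #   ||H b - y||^2 + C ||b||^2 - lam ||H b - f_bar||^2,
        # which yields ((1 - lam) H'H + C I) b = H'(y - lam * f_bar).
        A = (1.0 - lam) * (H.T @ H) + C * np.eye(D)
        betas[s] = np.linalg.solve(A, H.T @ (y - lam * f_bar))
    f_new = np.mean([H @ b for H, b in zip(Hs, betas)], axis=0)
    step = float(np.linalg.norm(f_new - f_bar))
    f_bar = f_new
    if step < 1e-8:   # successive iterates contract toward the fixed point
        break

print(f"stopped after {it + 1} iterations; last step size {step:.2e}")
```

With the small diversity weight used here, the distance between successive ensemble predictions shrinks geometrically, matching the contraction behaviour the paper analyses; larger values of lam can break the contraction property, which is exactly why sufficient conditions on the update matter.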
