Abstract

Extreme Learning Machine (ELM) has proven to be an efficient and fast algorithm for classification. In order to generalize the results of standard ELM, several ensemble meta-algorithms have been implemented. In this manuscript, we propose a hierarchical ensemble methodology that promotes diversity among the members of an ensemble explicitly, through the loss function of the single-hidden-layer feedforward network formulation of ELM. The diversity term in the loss function is justified using the concept of regularization from the Negative Correlation Learning framework. Statistical tests show that our proposal is competitive, in both performance and diversity measures, against bagging and boosting ensemble methodologies.
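For context, the generic Negative Correlation Learning penalty that motivates such a diversity term can be written as follows; this is the standard NCL formulation from the literature, not necessarily the exact loss used in the manuscript:

\[
e_i = \frac{1}{2}\bigl(f_i(\mathbf{x}) - y\bigr)^2 + \lambda\, p_i,
\qquad
p_i = \bigl(f_i(\mathbf{x}) - \bar{f}(\mathbf{x})\bigr)\sum_{j \neq i}\bigl(f_j(\mathbf{x}) - \bar{f}(\mathbf{x})\bigr),
\]

where \(f_i\) is the output of the \(i\)-th ensemble member, \(\bar{f} = \tfrac{1}{S}\sum_{j=1}^{S} f_j\) is the average over the \(S\) members, and \(\lambda \ge 0\) trades off individual accuracy against diversity. Since \(\sum_{j \neq i}(f_j - \bar{f}) = -(f_i - \bar{f})\), the penalty equals \(-(f_i - \bar{f})^2\), so minimizing \(e_i\) explicitly pushes each member's output away from the ensemble mean, which is the sense in which diversity is encouraged through the loss function.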
