Abstract

Extreme learning machine (ELM) has been a widely used learning paradigm for training single hidden layer feedforward networks (SLFNs). However, like many other classification algorithms, ELM may learn undesirable class boundaries from data with imbalanced classes. This paper first analyzes why class imbalance degrades ELM performance, and then discusses how several data distribution factors influence this degradation. Next, we present an optimal decision-output compensation strategy to deal with the class imbalance problem in the context of ELM. Specifically, the decision outputs of the minority classes in ELM are properly compensated. For a binary-class problem, finding the compensation can be regarded as a single-variable optimization problem, so the golden section search algorithm is adopted to find the optimal compensation value. For a multi-class problem, the particle swarm optimization (PSO) algorithm is used to solve the resulting multivariate optimization problem and to provide the optimal combination of compensations. Experimental results on a large number of imbalanced data sets demonstrate the superiority of the proposed algorithm. Statistical analysis indicates that the proposed approach not only outperforms the original ELM, but also yields better, or at least competitive, results compared with several widely used and state-of-the-art class imbalance learning methods.
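To make the idea concrete, the sketch below illustrates the binary-class variant described in the abstract: a basic ELM is trained, a compensation value is added to the minority-class decision output, and golden section search picks the value that maximizes a class-imbalance metric. This is only a minimal illustration under assumptions not stated in the abstract (sigmoid hidden nodes, one-hot targets, G-mean as the search criterion, a fixed search range `comp_max`, and helper names such as `elm_train` and `fit_compensation`), not the authors' implementation.

```python
# Illustrative sketch: ELM with minority-class output compensation (binary case).
# Assumptions: sigmoid hidden layer, one-hot targets, G-mean as the objective.
import numpy as np

def elm_train(X, y, n_hidden=100, seed=0):
    """Train a basic ELM: random input weights, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))      # random input weights
    b = rng.normal(size=n_hidden)                    # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer output matrix
    T = np.column_stack([(y == 0).astype(float),     # one-hot targets
                         (y == 1).astype(float)])
    beta = np.linalg.pinv(H) @ T                     # Moore-Penrose solution
    return W, b, beta

def elm_outputs(model, X):
    """Raw decision outputs of the trained ELM."""
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

def gmean(y_true, y_pred):
    """Geometric mean of per-class recalls, a common imbalance metric."""
    recalls = [np.mean(y_pred[y_true == c] == c) for c in np.unique(y_true)]
    return float(np.prod(recalls)) ** (1.0 / len(recalls))

def golden_section_search(f, lo, hi, tol=1e-3):
    """Maximize a unimodal scalar function f on the interval [lo, hi]."""
    phi = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b, d = d, c                              # shrink toward the left
            c = b - phi * (b - a)
        else:
            a, c = c, d                              # shrink toward the right
            d = a + phi * (b - a)
    return (a + b) / 2

def fit_compensation(model, X_val, y_val, minority=1, comp_max=2.0):
    """Find the additive compensation for the minority output that maximizes G-mean."""
    raw = elm_outputs(model, X_val)
    def score(c):
        out = raw.copy()
        out[:, minority] += c                        # compensate minority-class output
        return gmean(y_val, out.argmax(axis=1))
    return golden_section_search(score, 0.0, comp_max)
```

For the multi-class case described in the abstract, the single scalar search would be replaced by a population-based search (e.g. PSO) over one compensation value per minority class, with the same kind of imbalance-aware objective.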
