Abstract

As a flexible and efficient cost-sensitive learning algorithm, the label-weighted extreme learning machine (LW-ELM) has been proposed to address the class imbalance problem in multi-label data. However, because it relies on empirically assigned costs, the classification performance of LW-ELM cannot be guaranteed. To address this issue, this paper presents an improved algorithm called BLW-ELM, which integrates LW-ELM into the Boosting ensemble learning framework. Specifically, BLW-ELM assigns an appropriate cost to each training label of each training instance according to iterative feedback from the training results, thereby avoiding the need to directly model the intricate distribution of multi-label data. In other words, BLW-ELM is a universal and self-adaptive algorithm that improves classification robustness regardless of the type of data distribution. Twelve multi-label data sets are used to verify the effectiveness and superiority of the proposed algorithm. The experimental results indicate that BLW-ELM is significantly superior to LW-ELM and many other state-of-the-art multi-label imbalance learning algorithms, while generally requiring far less training time than those sophisticated algorithms.
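The core idea of assigning per-instance, per-label costs from iterative training feedback can be illustrated with an AdaBoost-style weight update. The sketch below is a simplified illustration of this general Boosting mechanism, not the authors' exact BLW-ELM rule; the function name, the learning-rate parameter, and the normalization scheme are assumptions for demonstration.

```python
import numpy as np

def update_label_weights(W, Y, Y_pred, lr=0.5):
    """AdaBoost-style per-label cost update (illustrative sketch only,
    not the exact BLW-ELM rule).

    W[i, j]      -- current cost of label j for instance i
    Y, Y_pred    -- true and predicted binary label matrices
    Returns the updated cost matrix and the learner weight alpha.
    """
    miss = (Y != Y_pred).astype(float)       # 1 where a label was missed
    err = (W * miss).sum() / W.sum()         # cost-weighted error rate
    err = np.clip(err, 1e-10, 1 - 1e-10)     # avoid log(0) below
    alpha = lr * np.log((1 - err) / err)     # weight of this base learner
    W = W * np.exp(alpha * miss)             # raise costs on missed labels
    W = W / W.sum() * W.size                 # renormalize to mean cost 1
    return W, alpha

# Toy example: 2 instances, 2 labels, one misclassified label (0, 1).
Y = np.array([[1, 0], [0, 1]])
Y_pred = np.array([[1, 1], [0, 1]])
W0 = np.ones((2, 2))
W1, alpha = update_label_weights(W0, Y, Y_pred)
```

After one round, the cost of the misclassified label `(0, 1)` exceeds the costs of the correctly classified labels, so the next base learner in the ensemble focuses more on that hard label.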
