Abstract

Online class imbalance learning is an emerging learning area that combines the challenges of both online learning and class imbalance learning. In addition to the learning difficulty arising from the imbalanced distribution, another major challenge is that the imbalance rate in a data stream can change dynamically over time. OOB and UOB are two state-of-the-art methods for online class imbalance problems [1]. UOB is better at recognizing minority-class examples when the imbalance rate does not change much over time, while OOB is better prepared for cases with a dynamic rate. Aiming for a method that is effective in both static and dynamic cases, this paper proposes a multi-objective ensemble method, MOSOB, that combines OOB and UOB. MOSOB finds Pareto-optimal weights for OOB and UOB at each time step, so as to maximize minority-class recall and majority-class recall simultaneously. Experiments on five real-world data applications show that MOSOB performs well on both static and dynamic data streams. Furthermore, we examine its performance on a group of highly imbalanced data streams. To recognize the minority class within 10,000 time steps, the imbalance rate can be as low as 0.1% for easy data streams, whereas an imbalance rate of at least 3% is required for difficult data streams.
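
The abstract only sketches the combination idea, so the following is a minimal, hypothetical Python sketch of how a weighted OOB/UOB combination with a Pareto-style weight search could look. It is not the authors' MOSOB implementation: the OOB and UOB ensembles are replaced by placeholder scores, per-class recalls are tracked with an assumed time-decay factor, and the Pareto-optimal weight search is approximated by a coarse grid over a single combination weight with an illustrative tie-breaking rule.

```python
import numpy as np

DECAY = 0.99                              # assumed time-decay factor for prequential recall
WEIGHT_GRID = np.linspace(0.0, 1.0, 11)   # candidate weights for OOB (UOB gets 1 - w)


class DecayedRecall:
    """Time-decayed recall estimate for a single class."""
    def __init__(self):
        self.hits = 0.0
        self.seen = 0.0

    def update(self, correct):
        self.hits = DECAY * self.hits + (1.0 if correct else 0.0)
        self.seen = DECAY * self.seen + 1.0

    @property
    def value(self):
        return self.hits / self.seen if self.seen > 0 else 0.0


def combine(score_oob, score_uob, w):
    """Weighted vote of the two ensembles' scores (probability of class 1)."""
    return int(w * score_oob + (1.0 - w) * score_uob >= 0.5)


def pareto_front(points):
    """Indices of non-dominated (minority recall, majority recall) pairs."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] >= p[0] and q[1] >= p[1] and (q[0] > p[0] or q[1] > p[1])
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front


def mosob_step(trackers, score_oob, score_uob, y_true):
    """One prequential step: choose a weight from the current Pareto front,
    predict with it, then update every candidate weight's recall trackers."""
    points = [(trackers[i][0].value, trackers[i][1].value)
              for i in range(len(WEIGHT_GRID))]
    # tie-break on the front by favouring balanced recalls (illustrative choice)
    best = max(pareto_front(points), key=lambda i: min(points[i]))
    y_pred = combine(score_oob, score_uob, WEIGHT_GRID[best])

    for i, w in enumerate(WEIGHT_GRID):
        p = combine(score_oob, score_uob, w)
        if y_true == 1:                    # class 1 is assumed to be the minority
            trackers[i][0].update(p == 1)  # minority-class recall
        else:
            trackers[i][1].update(p == 0)  # majority-class recall
    return y_pred, WEIGHT_GRID[best]


# toy usage with random scores standing in for real OOB/UOB ensemble outputs
rng = np.random.default_rng(0)
trackers = [(DecayedRecall(), DecayedRecall()) for _ in WEIGHT_GRID]
for _ in range(1000):
    y = int(rng.random() < 0.05)                  # roughly 5% minority class
    s_oob, s_uob = rng.random(), rng.random()     # placeholder ensemble scores
    y_pred, w = mosob_step(trackers, s_oob, s_uob, y)
```

In the actual method the chosen weights would be applied to OOB and UOB ensembles trained online on the same stream; the grid search and the balanced-recall tie-break above are simplifications used only to make the Pareto-selection idea concrete.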

