Abstract
Classification is one of the essential tasks in data mining. Traditional classification strategies are largely designed for cost-insensitive, balanced data: they concentrate on the overall accuracy of a model, which makes such classifiers poorly suited to imbalanced data. Hence, optimizing imbalanced samples to improve classifier performance is an issue worth addressing. Starting from the information-rich minority samples that are difficult to learn, the Majority Weighted Minority Oversampling Technique (MWMOTE) uses a clustering method to generate synthetic samples from the weighted informative samples. However, the accuracy of this clustering step leaves room for optimization. To this end, a method called NC_Link_MWMOTE is presented for efficiently handling imbalanced learning problems. We propose synthesizing new minority samples with an NC_Link-based hierarchical clustering method, thereby improving the clustering effect. NC_Link_MWMOTE was evaluated on six data sets with different levels of class imbalance. The simulation results show that our method is effective and outperforms competitive baseline methods on various assessment metrics, such as F1-score and Area Under the Curve (AUC).
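The sketch below illustrates, under stated assumptions, the kind of cluster-based minority oversampling pipeline the abstract describes: minority samples close to the majority class receive larger selection weights, the minority class is clustered, and synthetic samples are interpolated within clusters so that new points do not bridge separate minority regions. Ordinary average-linkage agglomerative clustering stands in here for the NC_Link-based hierarchical clustering, the distance-based weighting is a simplification of MWMOTE's information weights, and all names and parameters (mwmote_style_oversample, k, n_clusters) are illustrative assumptions rather than the authors' implementation.

    # Minimal sketch of a weighted, cluster-based minority oversampler (assumptions noted above).
    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from scipy.cluster.hierarchy import linkage, fcluster

    def mwmote_style_oversample(X_min, X_maj, n_synthetic, k=5, n_clusters=3, seed=None):
        rng = np.random.default_rng(seed)

        # 1. Weight minority samples: points closer to the majority class are
        #    harder to learn and receive larger selection weights (simplified
        #    surrogate for MWMOTE's information weights).
        dists, _ = NearestNeighbors(n_neighbors=k).fit(X_maj).kneighbors(X_min)
        weights = 1.0 / (dists.mean(axis=1) + 1e-12)
        weights /= weights.sum()

        # 2. Cluster the minority class (average-linkage agglomerative clustering
        #    used as a stand-in for NC_Link-based hierarchical clustering).
        labels = fcluster(linkage(X_min, method="average"), t=n_clusters, criterion="maxclust")

        # 3. Interpolate between a weight-sampled seed and a random member of the
        #    same cluster, so synthetic points stay inside minority regions.
        synthetic = np.empty((n_synthetic, X_min.shape[1]))
        for i in range(n_synthetic):
            s = rng.choice(len(X_min), p=weights)
            t = rng.choice(np.flatnonzero(labels == labels[s]))
            synthetic[i] = X_min[s] + rng.random() * (X_min[t] - X_min[s])
        return synthetic

    # Usage: synth = mwmote_style_oversample(X_minority, X_majority, n_synthetic=200)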