Abstract

The imbalanced classification problem has long been one of the important challenges in neural networks and machine learning. The synthetic minority oversampling technique (SMOTE), an effective method for imbalanced classification, has a drawback: noise samples may participate in the process of synthesizing new samples, so the synthesized samples lack rationality, which degrades the classification performance of the network. To remedy this shortcoming, two improved SMOTE methods are proposed in this paper: the center-point SMOTE (CP-SMOTE) method and the inner-and-outer SMOTE (IO-SMOTE) method. CP-SMOTE generates new samples by finding several center points and then linearly combining each minority sample with its corresponding center point. IO-SMOTE divides the minority samples into inner and outer samples and then relies on the inner samples as much as possible when generating new samples. Numerical experiments show that, compared with no sampling and the conventional SMOTE method, CP-SMOTE and IO-SMOTE achieve better classification performance.
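To make the CP-SMOTE idea concrete, the following is a minimal sketch of the procedure described above: locate several center points among the minority samples and synthesize each new sample as a convex combination of a minority sample and its corresponding center. The use of k-means to obtain the center points, and the function and parameter names, are illustrative assumptions; the paper may define the centers and the combination weights differently.

```python
import numpy as np

def cp_smote(X_min, n_centers=3, n_new=100, seed=0):
    """Sketch of the CP-SMOTE idea: find center points among the
    minority samples (here via a tiny k-means; an assumption, not
    necessarily the paper's construction), then synthesize new
    samples by linearly combining each chosen minority sample with
    its nearest center point."""
    rng = np.random.default_rng(seed)
    # Initialize centers from randomly chosen minority samples.
    centers = X_min[rng.choice(len(X_min), n_centers, replace=False)]
    for _ in range(20):
        # Assign each minority sample to its nearest center.
        d = np.linalg.norm(X_min[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned samples.
        for k in range(n_centers):
            if np.any(labels == k):
                centers[k] = X_min[labels == k].mean(axis=0)
    # Synthesize: x_new = x + lam * (center - x), with lam in [0, 1),
    # i.e. a point on the segment between a sample and its center.
    idx = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))
    return X_min[idx] + lam * (centers[labels[idx]] - X_min[idx])
```

Because every synthetic sample lies on a segment between a minority sample and a mean of minority samples, the generated points stay inside the per-coordinate range of the minority class, which illustrates how combining with centers (rather than arbitrary neighbors, as in conventional SMOTE) can keep synthetic samples away from boundary noise.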
