Abstract

The imbalanced classification problem has long been an important challenge in neural networks and machine learning. The synthetic minority oversampling technique (SMOTE) is an effective method for handling imbalanced classification, but it has a drawback: noise samples may participate in the synthesis of new samples, so the resulting synthetic samples can lack rationality, which reduces the classification performance of the network. To remedy this shortcoming, two novel improved SMOTE methods are proposed in this paper: the center-point SMOTE (CP-SMOTE) method and the inner-and-outer SMOTE (IO-SMOTE) method. The CP-SMOTE method generates new samples by first finding several center points and then linearly combining the minority samples with their corresponding center points. The IO-SMOTE method divides the minority samples into inner and outer samples, and then uses inner samples as much as possible in the subsequent generation of new samples. Numerical experiments show that, compared with no sampling and the conventional SMOTE method, the CP-SMOTE and IO-SMOTE methods achieve better classification performance.
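The abstract describes CP-SMOTE only at a high level: find several center points among the minority samples, then synthesize new samples as linear combinations of a minority sample and its corresponding center. The sketch below illustrates that idea under stated assumptions; the center-finding step here uses a naive k-means-style clustering and the interpolation coefficient is drawn uniformly from (0, 1), neither of which is specified by the abstract, so this is an illustrative reading rather than the paper's exact algorithm.

```python
import numpy as np

def cp_smote(X_min, n_centers=3, n_new=10, seed=0):
    """Illustrative sketch of the CP-SMOTE idea (assumptions noted above).

    X_min: array of shape (n_minority, n_features), minority-class samples.
    Returns n_new synthetic samples, each a convex combination of a
    minority sample and the center point of its cluster.
    """
    rng = np.random.default_rng(seed)
    X_min = np.asarray(X_min, dtype=float)

    # Assumption: center points come from a simple k-means-style loop.
    centers = X_min[rng.choice(len(X_min), n_centers, replace=False)].copy()
    for _ in range(10):
        # Assign each minority sample to its nearest center.
        dists = np.linalg.norm(X_min[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned samples.
        for k in range(n_centers):
            if (labels == k).any():
                centers[k] = X_min[labels == k].mean(axis=0)

    # Linear combination of a minority sample with its center point:
    # x_new = x + t * (center - x), with t drawn uniformly from (0, 1).
    idx = rng.integers(0, len(X_min), n_new)
    t = rng.random((n_new, 1))
    return X_min[idx] + t * (centers[labels[idx]] - X_min[idx])
```

Because each synthetic point lies on the segment between a minority sample and a minority-cluster center, the new samples stay inside the convex hull of the minority class, which is one plausible way such a method could avoid the noise-driven interpolations that plain SMOTE can produce.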
