Abstract

Imbalanced learning, a technique for coping with classification tasks in which one class greatly outnumbers another, has attracted considerable interest in the research community. The data gravitation-based classification (DGC) model, a recently developed physics-inspired classification method, performs well on many general classification problems. However, like other general-purpose classifiers, DGC suffers on imbalanced tasks. We therefore develop a data-level imbalanced learning DGC model, named SMOTE-DGC, in this paper. An oversampling technique, the Synthetic Minority Over-sampling Technique (SMOTE), is integrated with the DGC model to improve imbalanced learning performance. A total of 44 imbalanced classification data sets and several standard and imbalanced learning algorithms are used to evaluate the proposal. Experimental results suggest that the adapted DGC model is effective for imbalanced problems.
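SMOTE-DGC is described as a purely data-level approach: the training set is rebalanced by oversampling the minority class before the classifier is fitted. The sketch below illustrates that pipeline under stated assumptions. DGC itself is not available in standard libraries, so a k-nearest-neighbours classifier stands in for it here; the SMOTE implementation from the imbalanced-learn package and a synthetic imbalanced data set are also assumptions for illustration, not part of the paper's experimental setup.

```python
# Minimal sketch of a data-level SMOTE pipeline, assuming
# scikit-learn and imbalanced-learn. KNeighborsClassifier is a
# hypothetical stand-in for the DGC model, which has no standard
# library implementation.
from collections import Counter

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import balanced_accuracy_score
from imblearn.over_sampling import SMOTE

# Build a synthetic imbalanced binary problem (roughly 9:1 ratio).
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=42)

# Data-level step: oversample the minority class with SMOTE
# before training, as SMOTE-DGC does ahead of the DGC model.
X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)
print("class counts:", Counter(y_train), "->", Counter(y_res))

clf = KNeighborsClassifier()  # stand-in for the DGC classifier
clf.fit(X_res, y_res)

# Balanced accuracy is less misleading than plain accuracy
# when one class dominates the test set.
print("balanced accuracy:",
      balanced_accuracy_score(y_test, clf.predict(X_test)))
```

Because the rebalancing happens entirely before training, the same pattern applies to any base classifier, which is why the paper can adapt DGC without modifying its internal gravitation computation.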
