Abstract

Conventional classification assumes a balanced sample distribution among classes; when this assumption fails, performance becomes biased toward the majority class (the one with the most instances). Twin Support Vector Machines (TWSVM) have gained prominence due to their low computational burden compared to the standard SVM. Moreover, traditional machine learning favors methods whose solution rests on a convex problem or a positive semi-definite similarity matrix, yet such matrices cannot adequately represent many real-world applications. This motivates the use of non-negative similarity measures as indefinite kernels in a Reproducing Kernel Kreĭn Space (RKKS). This paper proposes a novel approach, termed Kreĭn Twin Support Vector Machines (KTSVM), which appropriately incorporates indefinite kernels within a TWSVM-based gradient optimization. To encode relevant input patterns for imbalanced data discrimination, our KTSVM employs an implicit mapping into an RKKS. Our approach also retains the benefits of the TWSVM scheme by constructing two nonparallel hyperplanes, which enables optimizing the KTSVM within a gradient-descent framework. Results on synthetic and real-world datasets demonstrate that our solution outperforms state-of-the-art techniques for imbalanced data classification.
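To make the setting concrete, below is a minimal illustrative sketch, not the authors' implementation, of a TWSVM-style objective for one of the two nonparallel hyperplanes, optimized by plain gradient descent with a hyperbolic-tangent kernel (a standard example of an indefinite kernel, whose Gram matrix need not be positive semi-definite). All names and hyperparameters here (`tanh_kernel`, `gamma`, `c1`, the learning rate, the toy data) are assumptions for illustration only; the distance rule at the end is also simplified (unnormalized).

```python
import numpy as np

def tanh_kernel(X, Y, gamma=0.5, c0=-1.0):
    # Sigmoid/tanh kernel: generally indefinite, so its Gram matrix need
    # not be positive semi-definite (the RKKS setting in the abstract).
    return np.tanh(gamma * X @ Y.T + c0)

def fit_plane(X_own, X_other, lr=1e-3, epochs=500, c1=1.0):
    """Fit one TWSVM-style hyperplane f(x) = K(x, X_all) @ alpha + b that
    stays close to its own class (squared term) while keeping the other
    class at least unit distance away (hinge term)."""
    X_all = np.vstack([X_own, X_other])            # kernel expansion points
    K_own = tanh_kernel(X_own, X_all)
    K_oth = tanh_kernel(X_other, X_all)
    alpha, b = np.zeros(X_all.shape[0]), 0.0
    for _ in range(epochs):
        f_own = K_own @ alpha + b
        active = (1.0 + K_oth @ alpha + b) > 0     # violated hinge terms
        g_alpha = K_own.T @ f_own + c1 * K_oth[active].sum(axis=0)
        g_b = f_own.sum() + c1 * active.sum()
        alpha -= lr * g_alpha                      # plain (sub)gradient step
        b -= lr * g_b
    return alpha, b, X_all

# Toy imbalanced problem: one plane per class; a new point is assigned to
# the class whose plane response is smaller in magnitude.
rng = np.random.default_rng(0)
X_maj = rng.normal(0.0, 1.0, size=(200, 2))        # majority class
X_min = rng.normal(3.0, 1.0, size=(20, 2))         # minority class
a_maj, b_maj, C_maj = fit_plane(X_maj, X_min)
a_min, b_min, C_min = fit_plane(X_min, X_maj)
x = np.array([[2.5, 2.8]])
d_maj = abs(tanh_kernel(x, C_maj) @ a_maj + b_maj)[0]
d_min = abs(tanh_kernel(x, C_min) @ a_min + b_min)[0]
print("minority" if d_min < d_maj else "majority")
```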
