Abstract

As a variant of the support vector machine (SVM), the twin support vector machine (TSVM) has attracted substantial attention; however, TSVM is sensitive to outliers. To remedy this, this study introduces the correntropy-induced loss (C-loss) function, a non-convex, bounded, smooth loss function, and proposes the C-loss TSVM (CTSVM). CTSVM jointly minimizes the C-loss function, the 2-norm regularization of the model coefficients, and the distance between the positive (negative) samples and the positive-class (negative-class) hyperplane, while keeping the negative (positive) samples far away from the positive (negative) hyperplane. CTSVM is cast into two half-quadratic optimization problems that can be solved by an alternating iterative method. As a result, the computational time of CTSVM remains comparable to that of related methods, especially for linear problems. Experimental results, analyzed statistically across multiple algorithms and datasets, confirm that CTSVM outperforms similar classifiers in robustness to outliers and classification accuracy on binary tasks. In the future, we aim to investigate the performance of CTSVM on multi-class classification tasks.
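
For illustration, a minimal NumPy sketch of the bounded C-loss mentioned above is given below. It assumes the commonly used correntropy-induced form l(e) = beta * (1 - exp(-e^2 / (2 * sigma^2))) with margin error e = 1 - y * f(x) and normalizing constant beta = 1 / (1 - exp(-1 / (2 * sigma^2))); the exact parameterization used in CTSVM may differ.

    import numpy as np

    def c_loss(margin_error, sigma=1.0):
        # Correntropy-induced loss: non-convex, bounded, and smooth.
        # margin_error: e = 1 - y * f(x) for labels y in {-1, +1}.
        # sigma: kernel width controlling how quickly the loss saturates.
        beta = 1.0 / (1.0 - np.exp(-1.0 / (2.0 * sigma ** 2)))  # scales the loss to 1 at e = 1
        return beta * (1.0 - np.exp(-margin_error ** 2 / (2.0 * sigma ** 2)))

    # The loss saturates for large errors, so a distant outlier contributes
    # at most beta instead of an unbounded, hinge-like penalty.
    errors = np.array([0.0, 1.0, 5.0, 50.0])
    print(c_loss(errors))

Because the loss is bounded by beta, the penalty incurred by an outlier cannot grow without limit, which is the source of the robustness to outliers claimed above.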
