Abstract

Twin K-class support vector classification (TKSVC) adopts a 'One-vs.-One-vs.-Rest' structure that uses all the samples to increase prediction accuracy. However, TKSVC is sensitive to noise and outliers because it employs the Hinge loss function. To reduce the negative influence of outliers, in this paper we propose a more robust algorithm, Ramp loss twin K-class support vector classification (Ramp-TKSVC), in which the Ramp loss function replaces the Hinge loss function of TKSVC. Because Ramp-TKSVC is a non-differentiable, non-convex optimisation problem, we solve it with the Concave–Convex Procedure (CCCP). To overcome the drawbacks of conventional multi-classification methodologies, TKSVC is used as the core of Ramp-TKSVC. In Ramp-TKSVC, outliers are prevented from becoming support vectors, so they are not involved in constructing the hyperplanes, which makes the method more robust; Ramp-TKSVC is also sparser than TKSVC. To verify the validity of Ramp-TKSVC, we conduct experiments on 12 benchmark datasets in both linear and nonlinear cases. The experimental results indicate that our algorithm outperforms the other five compared algorithms.
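The key substitution described above can be sketched numerically. A common formulation (an assumption here, since the abstract gives no formulas) writes the Ramp loss as a difference of two Hinge losses, R_s(z) = H_1(z) − H_s(z) with s < 1, which is exactly the convex-minus-convex split that CCCP exploits; the parameter `s` below is illustrative:

```python
import numpy as np

def hinge(z, s=1.0):
    # Hinge loss H_s(z) = max(0, s - z); unbounded as z decreases,
    # so badly misclassified outliers dominate the objective.
    return np.maximum(0.0, s - z)

def ramp(z, s=-1.0):
    # Ramp loss R_s(z) = H_1(z) - H_s(z), with s < 1.
    # Bounded above by 1 - s: outliers (very negative margins z)
    # incur a fixed, capped loss, so they cannot become support
    # vectors that dominate the hyperplane construction.
    return hinge(z, 1.0) - hinge(z, s)

# Margins from a confident outlier (-3) to a correct sample (2).
z = np.array([-3.0, -1.0, 0.0, 0.5, 1.0, 2.0])
print("hinge:", hinge(z))  # grows linearly for negative margins
print("ramp: ", ramp(z))   # capped at 1 - s = 2 for z <= s
```

Comparing the two printed rows shows why the Ramp version is more robust: the Hinge loss assigns the outlier at z = −3 a loss of 4, while the Ramp loss caps it at 2, the same as any other point with margin below s.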

