Abstract

The Laplacian support vector machine (LapSVM) has received considerable attention in the semi-supervised learning (SSL) field. Many efficient algorithms have been developed to further improve its computational speed, but they focus only on how the optimization problem is solved, not on the scale of the problem itself. Inspired by the sparsity of LapSVM, this paper proposes an efficient safe screening rule for LapSVM (SSR-LapSVM) to address this issue, which can significantly accelerate the original LapSVM. With the rule, most of the training samples can be eliminated before the optimization problem is solved, without sacrificing the optimal solution. An important advantage is safety: the resulting solution is exactly the same as that of the original LapSVM. Unlike most existing methods, our approach can effectively handle problems with multiple parameters. Experiments on 3 artificial datasets and 24 real-world benchmark datasets demonstrate its feasibility and efficiency.
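The core idea of safe screening, discarding samples that provably cannot be support vectors and then solving only the reduced problem, can be sketched generically as follows. This is a minimal illustration under assumed conditions, not the paper's actual SSR-LapSVM derivation: the reference solution `w_ref`, the ball radius `r`, and the function `screen_samples` are hypothetical names, and the bound shown is a generic ball-based margin test for a linear SVM.

```python
import numpy as np

def screen_samples(X, y, w_ref, r):
    """Generic safe-screening sketch (illustrative, not the paper's rule).

    Assumption: the optimal weight vector is known to lie in a ball of
    radius r around a reference solution w_ref. If the worst-case score
    y_i * <w, x_i> over that ball still exceeds 1, the hinge loss is
    inactive at the optimum, so sample i cannot be a support vector and
    may be removed without changing the solution.
    """
    margins = y * (X @ w_ref)               # scores under the reference solution
    slack = r * np.linalg.norm(X, axis=1)   # worst-case deviation inside the ball
    keep = margins - slack <= 1.0           # only these can still be active
    return np.flatnonzero(keep)

# Toy data: two well-separated clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+3, 0.3, (20, 2)), rng.normal(-3, 0.3, (20, 2))])
y = np.r_[np.ones(20), -np.ones(20)]

w_ref = np.array([0.5, 0.5])  # hypothetical reference solution
kept = screen_samples(X, y, w_ref, r=0.1)
# The optimization problem is then solved only on X[kept], y[kept].
```

For well-separated data like this, most (or all) samples are screened out, so the remaining optimization problem is much smaller than the original one.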
