Abstract

Feature noise, i.e., noise on the inputs, has long plagued the support vector machine (SVM). The conventional SVM with the hinge loss (C-SVM) is sparse but sensitive to feature noise. In contrast, the pinball loss SVM (pin-SVM) enjoys noise robustness but loses sparsity completely. To bridge the gap between C-SVM and pin-SVM, we propose the truncated pinball loss SVM (pin¯-SVM) in this paper. It provides a flexible framework for trading off sparsity against feature-noise insensitivity. Theoretical properties, including the Bayes rule, the misclassification error bound, sparsity, and noise insensitivity, are discussed in depth. To train pin¯-SVM, the concave-convex procedure (CCCP) is used to handle the non-convexity, and the decomposition method is used to solve the subproblem in each CCCP iteration. Accordingly, we modify the popular solver LIBSVM to conduct experiments, and the numerical results validate the properties of pin¯-SVM on synthetic and real-world data sets.
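For intuition, the minimal sketch below compares the three losses as functions of u = 1 - y f(x). The precise truncated form and its difference-of-convex split are assumptions based on capping the negative branch of the pinball loss at a level set by a truncation parameter s; they are illustrative, not the paper's verbatim definitions.

```python
import numpy as np

# u = 1 - y * f(x): u <= 0 means the point is correctly classified with
# margin, u > 0 means it lies inside the margin or is misclassified.

def hinge_loss(u):
    """Hinge loss (C-SVM): ignores points with u <= 0, hence sparse."""
    return np.maximum(0.0, u)

def pinball_loss(u, tau):
    """Pinball loss (pin-SVM): also penalizes u < 0 with slope -tau,
    which gives feature-noise robustness but removes sparsity."""
    return np.maximum(u, -tau * u)

def truncated_pinball_loss(u, tau, s):
    """Assumed truncated pinball loss: the negative branch is capped at
    tau * s, so points with u < -s receive a constant loss, stop
    influencing the solution, and sparsity is partially restored."""
    return np.where(u < -s, tau * s, pinball_loss(u, tau))

def dc_split(u, tau, s):
    """Assumed difference-of-convex split suitable for a CCCP loop:
    truncated loss = convex pinball loss - convex correction term."""
    convex = pinball_loss(u, tau)
    correction = np.maximum(0.0, -tau * (u + s))  # term subtracted by CCCP
    return convex - correction  # equals truncated_pinball_loss(u, tau, s)
```

Under this split, each CCCP iteration linearizes the subtracted term and leaves a convex pinball-type subproblem, which a decomposition solver such as the modified LIBSVM mentioned in the abstract can handle.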
