Abstract

KSVCR is an effective algorithm for multiclass classification, but it performs no variable selection and is time-consuming on large datasets. In this article, we propose a doubly sparse multiclass model, DKSVCR, which adds an elastic net regularization term to improve the performance of KSVCR. Motivated by the sparsity of DKSVCR, we further construct a simultaneous safe feature and sample screening rule, MFSS, to accelerate the solving of DKSVCR; the combined method is termed MFSS-DKSVCR. It has two major benefits. First, it performs classification and variable selection simultaneously, and highly correlated features tend to be selected or removed together. Second, by applying the feature screening and sample screening rules alternately rather than each individually, MFSS-DKSVCR can delete more redundant features and samples before the training stage, which greatly improves the solving speed. Moreover, MFSS-DKSVCR is safe in the sense that the solutions obtained from the reduced problem and the original problem are identical. In addition, a fast algorithm, SDCA, is used to solve the problem more efficiently. Experimental results on one artificial dataset, 28 benchmark datasets, and an image dataset verify the validity of our method.
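The grouping behavior credited to the elastic net term above can be seen in a minimal sketch. This is generic elastic net regression via scikit-learn, not the paper's DKSVCR model; the dataset, regularization strengths, and feature construction are all illustrative assumptions. Two nearly identical (highly correlated) features receive almost equal nonzero weights under the combined L1+L2 penalty, while a pure L1 penalty tends to concentrate weight on one of them.

```python
# Illustrative sketch (not the paper's DKSVCR): the grouping effect of
# elastic net regularization, by which highly correlated features tend
# to be selected or removed together.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

rng = np.random.default_rng(0)
n = 200
z = rng.standard_normal(n)
# Features 0 and 1 are near-duplicate copies of the signal z;
# features 2-4 are pure noise.
X = np.column_stack([
    z + 0.01 * rng.standard_normal(n),
    z + 0.01 * rng.standard_normal(n),
    rng.standard_normal((n, 3)),
])
y = z + 0.1 * rng.standard_normal(n)

# Elastic net's added L2 term spreads weight across both correlated
# features; pure L1 (lasso) tends to keep only one of them.
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)
print("elastic net coefs:", np.round(enet.coef_, 3))
print("lasso coefs:      ", np.round(lasso.coef_, 3))
```

The noise features are shrunk toward zero by both penalties; the difference shows up only on the correlated pair, which is the sparsity pattern the abstract exploits for variable selection.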
