Abstract
Parameter tuning is an indispensable step in guaranteeing the generalization of support vector machines (SVMs). Previous methods can be reduced to a nested two-layer framework: the inner layer solves a convex optimization problem, and the outer layer selects the hyper-parameters by minimizing either the cross-validation error or other error bounds. In this paper, we propose a novel, efficient parameter-tuning approach via kernel matrix approximation, focusing on improving the efficiency of SVM training in the inner layer. We first develop a kernel matrix approximation algorithm, MoCIC. We then apply MoCIC to compute a low-rank approximation of the kernel matrix, use this approximation to solve the quadratic programming problem of SVM approximately, and finally select the optimal candidate parameters according to the approximate cross-validation error (ACVE). We verify the feasibility and efficiency of the MoCIC-based parameter-tuning approach on 5 benchmark datasets. Experimental results show that our approach dramatically reduces the time consumption of parameter tuning while guaranteeing the effectiveness of the selected parameters.
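The abstract does not reproduce the MoCIC algorithm itself. As a rough illustration of the low-rank kernel-approximation step it describes, the sketch below uses a standard greedy pivoted incomplete Cholesky factorization of an RBF kernel matrix; the function names, the choice of incomplete Cholesky, and all parameter values are our assumptions for illustration, not the authors' method:

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Pairwise RBF kernel values between rows of X and Y.
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def pivoted_incomplete_cholesky(K, rank, tol=1e-8):
    # Greedy pivoted incomplete Cholesky: returns G (n x r) with K ~= G @ G.T.
    # This is a generic low-rank approximation, used here only as a stand-in
    # for the paper's MoCIC algorithm, whose details are not given in the abstract.
    n = K.shape[0]
    G = np.zeros((n, rank))
    d = np.diag(K).astype(float).copy()   # residual diagonal
    pivots = []
    for j in range(rank):
        i = int(np.argmax(d))             # pivot with largest residual
        if d[i] < tol:                    # residual exhausted: effective rank reached
            return G[:, :j], pivots
        pivots.append(i)
        G[:, j] = (K[:, i] - G @ G[i, :]) / np.sqrt(d[i])
        d = np.maximum(d - G[:, j]**2, 0.0)
    return G, pivots

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
K = rbf_kernel(X, X, gamma=0.5)

# Low-rank approximation: the r x r (or n x r) factor G would replace the full
# n x n kernel matrix inside the SVM quadratic program of the inner layer.
G, pivots = pivoted_incomplete_cholesky(K, rank=20)
rel_err = np.linalg.norm(K - G @ G.T) / np.linalg.norm(K)
```

In the two-layer framework, the outer layer would loop over a hyper-parameter grid (e.g. values of `gamma` and the SVM penalty `C`), recompute the low-rank factor for each candidate, and keep the setting with the smallest approximate cross-validation error.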