Abstract

This paper addresses the problem of optimizing the kernel and penalty parameters of SVM classifiers with a Gaussian kernel. Previous research based on the inter-cluster distance in the feature space (ICDF) evaluates a large number of discretized candidate values over a wide interval, which is computationally expensive. To reduce this overhead, a new inter-cluster induced distance in the feature space (ICIDF) is proposed to guide the kernel parameter selection of SVMs, and a theorem stating that the ICIDF is a positive, strictly unimodal function of the Gaussian kernel parameter is established for the first time. Based on this theorem, a fast two-stage parameter optimization approach for SVMs is presented. In the first stage, a modified golden section algorithm (MGSA) shrinks the value interval of the kernel parameter using only a small number of ICIDF evaluations. In the second stage, a differential evolution algorithm (BBDE or SADE) selects the best parameter combination for the SVM within the shrunk kernel parameter interval obtained by MGSA and a given interval of the penalty parameter. Experiments on benchmark datasets show that the proposed approach significantly shortens the training time of SVM models while keeping the testing accuracy of the trained SVMs competitive.
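To make the first stage concrete, the sketch below illustrates how a strictly unimodal kernel-selection criterion can be maximized with a golden-section search to shrink the kernel parameter interval. It is a minimal sketch, not the paper's method: the exact ICIDF formula and the MGSA modifications are defined in the paper, so a plain inter-cluster distance between class means in the Gaussian-kernel feature space stands in for ICIDF here, and the function names `icidf` and `shrink_sigma_interval` are hypothetical.

```python
import numpy as np

def icidf(sigma, X, y):
    """Stand-in criterion (assumption): squared distance between the two class
    means in the Gaussian-kernel feature space, expanded via kernel means."""
    gamma = 1.0 / (2.0 * sigma ** 2)

    def mean_kernel(A, B):
        # mean of K(a, b) = exp(-gamma * ||a - b||^2) over all pairs (a, b)
        sq = (np.sum(A ** 2, axis=1)[:, None]
              + np.sum(B ** 2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.mean(np.exp(-gamma * sq))

    X0, X1 = X[y == 0], X[y == 1]
    # ||m0 - m1||^2 in the feature space
    return mean_kernel(X0, X0) + mean_kernel(X1, X1) - 2.0 * mean_kernel(X0, X1)

def shrink_sigma_interval(X, y, lo, hi, tol=1e-2):
    """Golden-section search on the unimodal criterion: returns a narrow
    bracket [a, b] around the maximizing kernel parameter sigma."""
    phi = (np.sqrt(5.0) - 1.0) / 2.0            # golden-ratio factor
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = icidf(c, X, y), icidf(d, X, y)
    while b - a > tol:
        if fc > fd:                              # maximum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = icidf(c, X, y)
        else:                                    # maximum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = icidf(d, X, y)
    return a, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    print(shrink_sigma_interval(X, y, 1e-2, 1e2))
```

In the paper's second stage, the shrunk interval returned by this search would bound the kernel parameter while a differential evolution algorithm (BBDE or SADE) searches jointly over the kernel and penalty parameters, e.g. by maximizing cross-validated SVM accuracy; that stage is omitted from the sketch.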
