Abstract
The support vector machine (SVM) is a machine learning method whose generalization ability depends strongly on the choice of its parameters. In this study, the relationship between the error penalty parameter C, the kernel parameter σ, and the generalization ability of SVMs is examined. The parameter C adjusts the similarity among within-class members, while σ adjusts the similarity between classes. Moreover, C and σ balance each other within a certain range, forming a fan-shaped region of admissible parameters. The optimal parameters should lie near the center of this sector, where both C and σ are small. Accordingly, a method is proposed that first locates a suitable area with a coarse grid search and then determines the optimal parameters within that area using a fine bilinear grid. Experimental results show that the new parameter selection method not only avoids local optima, excluding the unstable cases in which both C and σ are large, but also makes the search process extremely fast. Compared with other parameter selection methods, the performance of the resulting SVMs is unaffected, and in some cases improved.
International Journal of Signal Processing, Image Processing and Pattern Recognition