Abstract

The Support Vector Machine (SVM) is a widely used method for classifying many types of data, but its main drawback is that classification speed drops considerably as problem size grows. In this paper, a new kernel function for SVM is proposed, derived from Hermite orthogonal polynomials. This function improves classification accuracy, reduces the number of support vectors, and increases classification speed. Overall kernel performance was evaluated on real-world data sets from the UCI repository using ten-fold cross-validation. Combinations of the Hermite function with common kernels are also proposed. Experimental results reveal that the Hermite–Chebyshev kernel, obtained by combining the Hermite and Chebyshev kernels, performs best in terms of the number of support vectors, while the Hermite–Gaussian kernel, produced by combining the Hermite and Gaussian kernels, ranks first in error rate among all kernels tested. Moreover, compared with common kernels, the proposed Hermite kernel has the fewest support vectors, the best error rate, and the lowest required time among all kernels.
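To make the idea concrete, the following is a minimal sketch of how an orthogonal-polynomial kernel of this kind could be plugged into an SVM. The kernel form used here, K(x, z) = Σₙ Hₙ(x)·Hₙ(z) summed over polynomial orders and input features, and the product-style Hermite–Gaussian combination, are assumptions for illustration: the abstract does not give the paper's exact definitions. The function names (`hermite_kernel`, `hermite_gaussian_kernel`), the degree, and the gamma value are all hypothetical choices.

```python
import numpy as np
from scipy.special import eval_hermite          # physicists' Hermite polynomials H_n
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import MinMaxScaler

def hermite_kernel(X, Y, degree=3):
    """Gram matrix for an assumed Hermite-polynomial kernel:
    K(x, z) = sum_{n=0}^{degree} H_n(x) . H_n(z), summed over features.
    This is PSD because each term is an inner product of the feature
    maps x -> H_n(x)."""
    K = np.zeros((X.shape[0], Y.shape[0]))
    for n in range(degree + 1):
        Hx = eval_hermite(n, X)                 # shape (n_samples_X, n_features)
        Hy = eval_hermite(n, Y)
        K += Hx @ Hy.T                          # accumulate over orders n
    return K

def hermite_gaussian_kernel(X, Y, degree=3, gamma=0.5):
    """Hypothetical Hermite-Gaussian combination: elementwise product of the
    Hermite and RBF Gram matrices (products of PSD kernels remain PSD)."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return hermite_kernel(X, Y, degree) * np.exp(-gamma * sq_dists)

# Usage: ten-fold cross-validation on a benchmark data set, mirroring the
# paper's evaluation protocol (iris stands in for a UCI data set).
X, y = load_iris(return_X_y=True)
X = MinMaxScaler(feature_range=(-1, 1)).fit_transform(X)   # keep polynomial values bounded
scores = cross_val_score(SVC(kernel=hermite_kernel), X, y, cv=10)
print("mean 10-fold accuracy:", scores.mean().round(3))
```

`sklearn.svm.SVC` accepts any callable that maps two sample matrices to a Gram matrix, so the same pattern works for `hermite_gaussian_kernel` or a Hermite–Chebyshev variant; rescaling inputs to [-1, 1] first helps keep the polynomial terms numerically well behaved.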
