Abstract

The support vector machine is one of the most widely used machine learning algorithms and rests on a solid mathematical foundation. Its power comes from the kernel trick, which lets the model handle non-linearly distributed data through functions that satisfy the Mercer condition. The radial basis function (RBF) is undoubtedly among the most widely used of these functions. The RBF kernel, whose profile follows a Gaussian curve, produces local decision boundaries and supports the generalization ability of the model. In this study, a novel kernel function named the Logarithmic Kernel Function (LKF), which has a Gaussian-like curve, is presented. The key contribution of the LKF is that it models the data better than other similarly shaped kernels when only a few training samples are available (10% and 30% of the data). Six classification and five regression datasets are used to compare the kernels. Instead of setting the hyperparameters of the models manually, they are estimated with the Tree-structured Parzen Estimator, which is based on sequential model-based optimization. This estimation process is repeated 10 times and the average results are reported. The proposed LKF outperforms the most competitive kernels on a variety of classification and regression tasks.
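To make the evaluation setup concrete, the sketch below shows one way the abstract's pipeline could look in practice: a custom Gaussian (RBF) kernel passed to scikit-learn's SVC as a callable, with its hyperparameters tuned by Optuna's Tree-structured Parzen Estimator. This is an illustrative sketch, not the authors' code; since the abstract does not give the LKF formula, the RBF kernel it mentions stands in, and the dataset, search ranges, and trial budget are assumptions.

```python
# Hedged sketch: custom kernel callable for SVC + TPE-based hyperparameter search.
# The exact LKF formula is not stated in the abstract, so the RBF kernel is used
# here as a placeholder; another Mercer kernel only requires changing kernel_fn.
import numpy as np
import optuna
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # assumed toy dataset, not one from the paper


def make_rbf_kernel(gamma):
    # K(x, z) = exp(-gamma * ||x - z||^2): the Gaussian-shaped kernel the abstract
    # uses as the main point of comparison.
    def kernel_fn(A, B):
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * sq_dists)
    return kernel_fn


def objective(trial):
    # TPE proposes C and gamma; mean cross-validated accuracy is the score it models.
    C = trial.suggest_float("C", 1e-2, 1e2, log=True)
    gamma = trial.suggest_float("gamma", 1e-3, 1e1, log=True)
    clf = SVC(C=C, kernel=make_rbf_kernel(gamma))
    return cross_val_score(clf, X, y, cv=5).mean()


study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=30)
print(study.best_params, study.best_value)
```

Repeating this search with different random seeds and averaging the resulting scores mirrors the "repeated 10 times" protocol described in the abstract.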
