Abstract

The Support Vector Machine (SVM) is one of the most successful machine learning methods. SVM uses the kernel trick to efficiently learn non-linear classification tasks: a kernel function is applied to the data instances to map the original non-linearly separable observations into a higher-dimensional space in which they become separable. The Gaussian Radial Basis Function (RBF) kernel is a popular kernel function used in various kernelized learning methods. In the RBF kernel, the sigma parameter (the bandwidth of the kernel function) must be fine-tuned, which is a challenging task: if the sigma value is very small, the decision boundary is highly non-linear, whereas if it is large, the decision boundary tends toward linearity. In this paper, an efficient kernel method called the Apollonius SVM Kernel (ASVMK) is introduced. The algorithm adjusts the width of the kernel based on the data density and the representativeness of each class. The main idea is to construct a kernel that requires no manual bandwidth tuning, built on landmark points and the Apollonius similarity function. ASVMK finds the correlation between dense points and their neighbors in the data and uses these features to map the original non-linear observations into a higher-dimensional space. Reduced memory requirements and lower computational cost, owing to the shrinkage of the Gram matrix, are among the benefits of the proposed kernel. The proposed method outperformed state-of-the-art methods by up to 11%.
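
To make the role of the sigma parameter concrete, below is a minimal Python sketch of the Gaussian RBF kernel alongside a hypothetical Apollonius-style similarity. Since the abstract does not give ASVMK's exact formulation, the apollonius_similarity function, its landmark arguments, and the distance-ratio form are illustrative assumptions only, not the paper's method.

    import numpy as np

    def rbf_kernel(x, y, sigma=1.0):
        """Gaussian RBF kernel: K(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
        Small sigma -> highly localized similarity (non-linear boundaries);
        large sigma -> near-constant similarity (near-linear boundaries)."""
        return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

    def apollonius_similarity(x, landmark_a, landmark_b, eps=1e-12):
        """Hypothetical Apollonius-style similarity (an assumption, not the
        paper's formula): the ratio of distances from x to two landmark
        points, the quantity that defines an Apollonius circle. Note that
        no bandwidth parameter appears."""
        d_a = np.linalg.norm(x - landmark_a)
        d_b = np.linalg.norm(x - landmark_b)
        return d_a / (d_b + eps)

    # Example: the same pair of points under different sigma values.
    x = np.array([0.0, 0.0])
    y = np.array([1.0, 1.0])
    for sigma in (0.1, 1.0, 10.0):
        print(f"sigma={sigma:5.1f}  K(x, y)={rbf_kernel(x, y, sigma):.6f}")

For a fixed pair of points, a very small sigma drives the kernel value toward zero (each point is similar only to itself, yielding highly non-linear boundaries), while a large sigma drives it toward one (all points look alike, yielding near-linear boundaries); a distance-ratio similarity, by contrast, has no such bandwidth to tune.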
