Abstract

The Support Vector Machine (SVM) is a popular supervised machine learning algorithm that can be used for both regression and classification tasks. The SVM algorithm operates by finding the optimal hyperplane that discriminates between classes; a kernel function maps the data into a space where such a hyperplane can be found. In SVM, the penalty parameter C and the \(\sigma \) parameter of the Radial Basis Function (RBF) kernel can have a significant impact on the complexity and performance of the model. These parameters are often chosen at random, yet SVM requires optimal parameter values to achieve the expected learning performance. In this chapter, an optimization method based on optimal foraging theory is proposed to adjust the two main parameters of the Gaussian kernel function of SVM and thereby increase classification accuracy. Six well-known benchmark datasets taken from the UCI machine learning repository were employed to evaluate the proposed optimal foraging algorithm for SVM parameter optimization (OFA-SVM). In addition, the performance of OFA-SVM is compared with five well-known and recent meta-heuristic optimization algorithms: the Bat Algorithm (BA), Genetic Algorithm (GA), Artificial Bee Colony (ABC), Chicken Swarm Optimization (CSO), and Particle Swarm Optimization (PSO). The experimental results show that the proposed OFA-SVM achieves better results than the other algorithms, and they demonstrate its capability to find the optimal values of the RBF parameters of SVM.
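The setup the abstract describes, tuning C and the RBF width to maximize classification accuracy, can be sketched as follows. This is a minimal illustration using scikit-learn and a toy grid search, not the OFA-SVM algorithm itself; the dataset, the candidate values, and the mapping gamma = 1/(2·sigma²) (one common RBF parameterization) are assumptions for the example. Any of the listed metaheuristics (BA, GA, ABC, CSO, PSO, or OFA) would wrap its own search strategy around an objective like `cv_accuracy` below.

```python
# Sketch: evaluating how the penalty parameter C and the RBF parameter gamma
# affect SVM classification accuracy via cross-validation. A metaheuristic
# optimizer would propose (C, gamma) candidates and maximize this objective.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-in dataset (the chapter uses six UCI benchmark datasets).
X, y = load_iris(return_X_y=True)

def cv_accuracy(C, gamma):
    """Objective function a parameter-tuning algorithm would maximize:
    mean 5-fold cross-validated accuracy of an RBF-kernel SVM."""
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    return cross_val_score(clf, X, y, cv=5).mean()

# A crude grid stands in for the metaheuristic search loop.
candidates = [(C, g) for C in (0.1, 1.0, 10.0) for g in (0.01, 0.1, 1.0)]
best = max(candidates, key=lambda p: cv_accuracy(*p))
print("best (C, gamma):", best)
```

The key point is that the classifier's accuracy is treated as a black-box objective of (C, \(\sigma \)), which is what makes population-based metaheuristics such as OFA applicable.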
