Abstract

The setting of parameters in support vector machines (SVMs) strongly affects their accuracy and efficiency. In this paper, we employ the firefly algorithm to train all parameters of the SVM simultaneously, including the penalty parameter, the smoothness parameter, and the Lagrangian multipliers. The proposed method is called the firefly-based SVM (firefly-SVM). Feature selection is not considered, because combining the SVM with feature selection is not well suited to multiclass classification, especially for the one-against-all multiclass SVM. Both binary and multiclass classification are explored in the experiments. For binary classification, ten benchmark data sets from the University of California, Irvine (UCI) machine learning repository are used; in addition, the firefly-SVM is applied to the multiclass diagnosis of ultrasonic supraspinatus images. The classification performance of the firefly-SVM is also compared with that of the original LIBSVM with grid search and of the particle swarm optimization based SVM (PSO-SVM). The experimental results support the use of the firefly-SVM for pattern classification with high accuracy.
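To make the search strategy concrete, the following is a minimal sketch of the generic firefly algorithm applied to a two-dimensional parameter space such as (C, γ). This is not the paper's firefly-SVM implementation: the objective function below is a hypothetical stand-in for the cross-validation error surface, and the attractiveness constants (`beta0`, `gamma_fa`, `alpha`) are assumed default values from the standard firefly update rule, in which each firefly moves toward every brighter (lower-cost) firefly with distance-decaying attractiveness plus a random perturbation.

```python
import math
import random

def firefly_optimize(objective, bounds, n_fireflies=15, n_iters=50,
                     alpha=0.2, beta0=1.0, gamma_fa=1.0, seed=0):
    """Generic firefly algorithm (minimization).

    `objective` maps a point (list of floats) to a cost;
    `bounds` is a list of (lo, hi) pairs, one per dimension.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial population within the search bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_fireflies)]
    cost = [objective(x) for x in pop]
    for _ in range(n_iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:  # firefly j is "brighter" (lower cost)
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma_fa * r2)  # attractiveness decays with distance
                    for d in range(dim):
                        step = beta * (pop[j][d] - pop[i][d]) + alpha * (rng.random() - 0.5)
                        lo, hi = bounds[d]
                        pop[i][d] = min(max(pop[i][d] + step, lo), hi)  # clamp to bounds
                    cost[i] = objective(pop[i])
    best = min(range(n_fireflies), key=lambda k: cost[k])
    return pop[best], cost[best]

# Hypothetical stand-in for a cross-validation error surface over
# (log2 C, log2 gamma), with its minimum at (5, -3). In the firefly-SVM,
# the objective would instead be the SVM's cross-validation error.
obj = lambda x: (x[0] - 5.0) ** 2 + (x[1] + 3.0) ** 2
best_x, best_cost = firefly_optimize(obj, bounds=[(-5, 15), (-15, 3)])
```

Because the population shares information through pairwise attraction, the search concentrates around good regions of the (C, γ) plane without the exhaustive enumeration that grid search requires.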

Highlights

  • The support vector machines (SVMs) have been widely used in many applications, including the decision-making application [1], forecasting malaria transmission [2], liver fibrosis diagnosis [3], and pattern classification [4]

  • The platform used to develop the firefly-SVM training algorithm was a personal computer with an Intel Pentium IV 3.0 GHz CPU and 2 GB of RAM, running the Windows XP operating system with Visual C++ 6.0 and the OpenCV library

  • We explore the use of the firefly-SVM for binary and multiclass classification


Introduction

Support vector machines (SVMs) have been widely used in many applications, including decision making [1], forecasting malaria transmission [2], liver fibrosis diagnosis [3], and pattern classification [4]. Based on the Vapnik-Chervonenkis theory and the structural risk minimization principle, the SVM achieves a tradeoff between minimizing the training set error and maximizing the margin, which gives it excellent generalization ability [5,6,7]. The setting of the SVM classifier's parameters plays a significant role; these include the penalty parameter C and the smoothness parameter γ of the radial basis function kernel. The penalty parameter C balances the minimization of the fitting error against model complexity, while the smoothness parameter γ of the kernel function determines the nonlinear mapping from the input space to the high-dimensional feature space.
