Abstract

Group intelligence algorithms have been widely used in support vector machine (SVM) parameter optimization because of their strong parallel processing ability, fast optimization, and global search capability. However, few studies have compared the optimization performance of different group intelligence algorithms on SVMs, especially for hyperspectral remote sensing classification. In this paper, we compare the optimization performance of three group intelligence algorithms run on an SVM in terms of five aspects, using three hyperspectral images (Indian Pines, University of Pavia, and Salinas): stability to parameter settings, convergence rate, feature selection ability, sample size, and classification accuracy. The three group intelligence algorithms are particle swarm optimization (PSO), the genetic algorithm (GA), and the artificial bee colony (ABC) algorithm. Our results showed that the influence of these three optimization algorithms on the C-parameter of the SVM was smaller than their influence on the σ-parameter. The convergence rates, numbers of selected features, and accuracies of the three group intelligence algorithms differed significantly at the p = 0.01 level. The GA could compress more than 70% of the original data and was the least affected by sample size. GA-SVM had the highest average overall accuracy (91.77%), followed by ABC-SVM (88.73%) and PSO-SVM (86.65%). In particular, in complex scenes (e.g., the Indian Pines image), GA-SVM showed the highest classification accuracy (87.34%, which was 8.23% higher than ABC-SVM and 16.42% higher than PSO-SVM) and the best stability (the standard deviation of its classification accuracy was 0.82%, which was 5.54% lower than that of ABC-SVM and 21.63% lower than that of PSO-SVM).
Therefore, compared with the ABC and PSO algorithms, the GA had more advantages in terms of feature band selection, small-sample classification, and classification accuracy.

Highlights

  • A support vector machine (SVM) is a supervised nonparametric statistical learning technique that was first presented by Vapnik [1]

  • Particle swarm optimization (PSO) is an evolutionary algorithm that was developed by Kennedy et al. [44]; it originated as a simulation of a bird flock, where each bird is considered a “particle”

  • 3.1.2. Indian Pines Image: The AVIRIS Indian Pines dataset comprises a hyperspectral image, obtained with the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), and a ground truth image. The image was acquired on 12 June 1992 over the Indian Pines test site in Indiana, USA
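To make the PSO highlight concrete, the following is a minimal sketch of PSO applied to SVM parameter search. It is not the paper's implementation: each particle encodes a candidate (C, σ) pair, and a simple quadratic surrogate (with an assumed optimum at C = 100, σ = 1.0, chosen purely for illustration) stands in for the cross-validation error that a real PSO-SVM would evaluate.

```python
import random

def surrogate_cv_error(c, sigma):
    # Stand-in for SVM cross-validation error; optimum assumed at (100, 1.0).
    return (c - 100.0) ** 2 / 1e4 + (sigma - 1.0) ** 2

def pso(n_particles=20, n_iter=50, seed=0):
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    # Each particle is a candidate (C, sigma) pair.
    pos = [[rng.uniform(1, 1000), rng.uniform(0.01, 10)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_err = [surrogate_cv_error(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_err[i])
    gbest, gbest_err = pbest[g][:], pbest_err[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + pull toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            err = surrogate_cv_error(*pos[i])
            if err < pbest_err[i]:
                pbest[i], pbest_err[i] = pos[i][:], err
                if err < gbest_err:
                    gbest, gbest_err = pos[i][:], err
    return gbest, gbest_err

(best_c, best_sigma), err = pso()
```

In a real PSO-SVM, `surrogate_cv_error` would be replaced by the cross-validated classification error of an RBF-kernel SVM trained with the particle's (C, σ).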



Introduction

A support vector machine (SVM) is a supervised nonparametric statistical learning technique that was first presented by Vapnik [1]. Traditional SVM parameter optimization methods include experimental methods [12], grid methods [9,25], and the gradient descent method [26,27]. These algorithms have various problems (such as large time consumption, low efficiency, and low precision), which limit their ability to meet application requirements. In this paper, we compare the optimization performance of three GI algorithms (a GA, PSO, and an ABC algorithm) on an SVM in terms of five aspects using three widely used hyperspectral datasets: stability to parameter settings, convergence rate, feature selection ability, sample size, and classification accuracy. This work provides a reference for selecting the optimal SVM parameter optimization method.
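The GA's band-compression behaviour reported above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: each chromosome is a binary mask over spectral bands, and a surrogate fitness (assumed here: bands 0–9 are "informative", with a 0.1-per-band size penalty) rewards keeping the informative bands while discarding the rest, mimicking how GA-SVM compresses the original feature set.

```python
import random

N_BANDS = 50                 # illustrative band count
INFORMATIVE = set(range(10)) # assumed informative bands (illustration only)

def fitness(mask):
    # Accuracy proxy (informative bands retained) minus a size penalty.
    hits = sum(1 for i in INFORMATIVE if mask[i])
    kept = sum(mask)
    return hits - 0.1 * kept

def ga(pop_size=40, n_gen=60, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_BANDS)] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]        # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_BANDS)  # one-point crossover
            child = a[:cut] + b[cut:]
            # Bit-flip mutation with probability p_mut per gene.
            child = [1 - g if rng.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
kept = sum(best)
```

In a real GA-SVM, the fitness would be the cross-validated accuracy of an SVM trained only on the bands selected by the mask (optionally with the SVM parameters encoded in the same chromosome).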

Artificial Bee Colony Algorithm
Genetic Algorithm
Particle Swarm Optimization
SVM Optimized with the GI Algorithms
Classification Results of the Three Hyperspectral Remote Sensing Datasets
Method
GI Algorithm Performance Comparison
The Impact of Sample Size on GI Algorithms’ Performance
Conclusions
