Abstract

Feature selection can remove irrelevant features and improve the accuracy of data classification in pattern classification. At present, the back propagation (BP) neural network and the particle swarm optimization (PSO) algorithm can be combined effectively for feature selection. On this basis, this paper adds interference factors to the BP neural network and the PSO algorithm to improve the accuracy and practicability of feature selection. This paper summarizes the basic methods and requirements of feature selection and combines the global optimization ability of PSO with the feedback mechanism of the BP neural network to select features, yielding a model based on backpropagation and particle swarm optimization (BP-PSO). Firstly, a chaotic model is introduced to increase the diversity of particles during the initialization of PSO, and an adaptive factor is introduced to enhance the global search ability of the algorithm. Then, the number of features is reduced while the accuracy of feature selection is maintained. Finally, different data sets are used to test the accuracy of feature selection, and the evaluation mechanisms of the wrapper mode and the filter mode are used to verify the practicability of the model. The results show that the average accuracy of BP-PSO is 8.65% higher than that of the suboptimal NDFS model across the data sets, and the performance of BP-PSO is 2.31% to 18.62% higher than the benchmark method on all data sets. This shows that BP-PSO can select more discriminative feature subsets, which verifies the accuracy and practicability of the model.
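The two PSO modifications named in the abstract, chaotic initialization for particle diversity and an adaptive factor for global search, can be illustrated with a small sketch. This is not the paper's implementation: the logistic map, the linearly decreasing inertia weight, the binary (sigmoid-transfer) particle encoding, and the toy fitness function are all common choices assumed here for illustration.

```python
import math
import random

def chaotic_init(n_particles, n_features, x0=0.7):
    """Logistic-map chaotic sequence used to spread initial particle positions.

    One common reading of "chaotic model for particle diversity": thresholding
    successive logistic-map values into 0/1 feature masks.
    """
    x = x0
    swarm = []
    for _ in range(n_particles):
        particle = []
        for _ in range(n_features):
            x = 4.0 * x * (1.0 - x)  # logistic map; fully chaotic at r = 4
            particle.append(1 if x > 0.5 else 0)
        swarm.append(particle)
    return swarm

def adaptive_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight: broad exploration early, refinement late."""
    return w_max - (w_max - w_min) * t / t_max

def bpso_feature_select(fitness, n_features, n_particles=20, iters=50, seed=0):
    """Binary PSO over 0/1 feature masks; returns the best mask and its fitness."""
    rng = random.Random(seed)
    swarm = chaotic_init(n_particles, n_features)
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    pbest_fit = [fitness(p) for p in swarm]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]
    c1 = c2 = 2.0  # standard cognitive/social coefficients
    for t in range(iters):
        w = adaptive_inertia(t, iters)
        for i, p in enumerate(swarm):
            for d in range(n_features):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - p[d])
                             + c2 * r2 * (gbest[d] - p[d]))
                vel[i][d] = max(-6.0, min(6.0, vel[i][d]))  # clamp velocity
                s = 1.0 / (1.0 + math.exp(-vel[i][d]))  # sigmoid transfer
                p[d] = 1 if rng.random() < s else 0
            f = fitness(p)
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = p[:], f
                if f > gbest_fit:
                    gbest, gbest_fit = p[:], f
    return gbest, gbest_fit

# Toy fitness standing in for classifier accuracy: reward known-relevant
# features, penalize subset size (the paper's size/accuracy trade-off).
relevant = {0, 1, 2}
def toy_fitness(mask):
    return sum(mask[i] for i in relevant) - 0.1 * sum(mask)

best, score = bpso_feature_select(toy_fitness, n_features=10)
```

In the paper's wrapper setting the fitness would instead be the BP network's validation accuracy on the selected features; the size penalty mirrors the abstract's goal of reducing the number of features while preserving accuracy.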

Highlights

  • Image retrieval and other application fields are emerging [1]

  • In order to have an objective understanding of the performance of the back propagation (BP)-PSO model, all features are used as the benchmark in the experiment and compared with five related adaptive feature selection algorithms. The control models involved in the experiment were Baseline, LapScore, unsupervised discriminant feature selection (UDFS), NDFS, FSASL, and SOGFS

  • If the best model of any feature selection algorithm and classification algorithm is used as the final model of the data set, the six cancer data sets can be improved by backpropagation and particle swarm optimization (BP-PSO) coding features. The accuracy of the final models for PRAD and THCA increased by 0.23%. The final model of the data set BRCA improved by a maximum accuracy of 0.36%


Summary

Introduction

Image retrieval and other application fields are emerging [1]. In these problems, the data is often cumbersome and the number of features is large. Therefore, there are higher requirements for feature selection, and feature selection methods for complex data emerge as the times require [2]. Although these algorithms have some search ability, their efficiency is not high and they waste resources, so a more efficient search strategy for feature selection is needed. Compared with the earlier threshold methods, information-theoretic measures can improve the quality of the feature subset [7].
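The information-theoretic measure referenced at [7] can be sketched with mutual information between a discrete feature and the class label, a standard filter-mode relevance score. This is an illustrative assumption, not necessarily the specific measure used in the cited work; the function name is mine.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Empirical I(X;Y) = sum over (a,b) of p(a,b) * log(p(a,b) / (p(a) p(b))),
    for two equal-length sequences of discrete values. Higher means the
    feature x carries more information about the label y (in nats)."""
    n = len(x)
    pxy = Counter(zip(x, y))  # joint counts
    px = Counter(x)           # marginal counts for the feature
    py = Counter(y)           # marginal counts for the label
    mi = 0.0
    for (a, b), c in pxy.items():
        mi += (c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
    return mi

# A feature identical to the label attains I(X;Y) = H(Y) = ln 2 for a
# balanced binary label; an independent feature scores 0.
perfect = mutual_information([0, 0, 1, 1], [0, 0, 1, 1])      # ~0.693 nats
useless = mutual_information([0, 0, 1, 1], [0, 1, 0, 1])      # 0.0
```

A filter-mode selector would rank all features by such a score and keep the top k, in contrast to the wrapper mode, which evaluates subsets through the classifier itself.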

