Abstract

Feature selection is a challenging step in data mining because a feature space contains many locally optimal solutions. Feature selection can be cast as an optimization problem that seeks a feature subset that is as small as possible while yielding high classification accuracy. This paper proposes the binary symbiotic organism search (BSOS) algorithm, which maps the symbiotic organism search algorithm from a continuous space to a discrete space using an adaptive S-shaped transfer function and can be used to search the feature selection space for the optimal feature subset. The proposed BSOS algorithm is evaluated on 19 datasets from the UCI repository. First, the results of four basic S-shaped transfer functions are compared with those of the adaptive S-shaped transfer function. The experimental results are then compared with those obtained by the popular binary grasshopper optimization, binary gray wolf optimization, traditional binary particle swarm optimization, and binary differential evolution algorithms, which have also been employed for feature selection in the existing literature. The results show that the BSOS algorithm finds the fewest features on most datasets while achieving high classification accuracy. The experiments also show that the BSOS algorithm remains at a disadvantage on low-dimensional datasets and attains low sensitivity on hyperdimensional datasets.

Highlights

  • With the rapid development of the Information Age, the problem of data expansion has become increasingly serious; this dimensionality problem must be handled through effective dimension reduction methods [1]

  • The proposed binary symbiotic organism search (BSOS) algorithm, based on the adaptive S-shaped (AS) transfer function, is compared with four methods based on the basic S-shaped transfer functions (SOS-S1, SOS-S2, SOS-S3, and SOS-S4) on the feature selection task for each dataset

  • A comparison of the results of BSOS-based feature selection with those of the four other common transfer-function methods shows that the BSOS algorithm searches the feature selection space more effectively because AS considers the current iteration, which helps to balance the exploitation and exploration behaviors of the SOS algorithm (see the sketch after these highlights)
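The following minimal sketch illustrates the binarization idea described in these highlights: a continuous SOS position is passed through an S-shaped transfer function to obtain a per-feature selection probability, which is then thresholded into a binary mask. The four basic forms shown (S1–S4) are those commonly used in the binary-metaheuristic literature; the iteration-dependent slope in `adaptive_s` is only an assumed form of the AS function, since the paper's exact definition is not reproduced here.

```python
import numpy as np

# Four basic S-shaped transfer functions commonly used to binarize
# continuous metaheuristics (often labelled S1-S4 in the literature).
S_FUNCTIONS = {
    "S1": lambda x: 1.0 / (1.0 + np.exp(-2.0 * x)),
    "S2": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "S3": lambda x: 1.0 / (1.0 + np.exp(-x / 2.0)),
    "S4": lambda x: 1.0 / (1.0 + np.exp(-x / 3.0)),
}

def adaptive_s(x, t, t_max):
    """Assumed adaptive S-shaped (AS) transfer function.

    The slope tightens as iteration t approaches t_max, so early
    iterations flip bits more freely (exploration) and late iterations
    stabilize the mask (exploitation). The exact form used by BSOS is
    not reproduced here; this is an illustrative assumption.
    """
    slope = 1.0 + t / t_max
    return 1.0 / (1.0 + np.exp(-slope * x))

def binarize(position, t, t_max, rng):
    """Map a continuous SOS position vector to a binary feature mask."""
    probs = adaptive_s(position, t, t_max)
    return (rng.random(position.shape) < probs).astype(int)

rng = np.random.default_rng(0)
x = rng.normal(size=10)                      # continuous position over 10 features
print(binarize(x, t=5, t_max=100, rng=rng))  # e.g. [1 0 1 ...] -> selected features
```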


Summary

Introduction

With the rapid development of the Information Age, the problem of data expansion has become increasingly serious; this dimensionality problem must be handled through effective dimension reduction methods [1]. Feature selection, an important preprocessing step in data mining, can effectively eliminate redundant data and retain informative, relevant data. It is widely used in data prediction and analysis [2]. As the name implies, feature selection chooses subsets with as few attribute features as possible from all attributes [3]. The selected feature subset contains enough effective information to stand in for the feature information of all attributes [4].
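To make the selection objective concrete, the sketch below shows a hypothetical wrapper-style fitness for a binary feature mask that trades classification error against the relative size of the selected subset. The k-nearest-neighbors classifier, the 5-fold cross-validation, and the weight alpha are illustrative assumptions, not the configuration described in this paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def feature_subset_fitness(mask, X, y, alpha=0.99):
    """Hypothetical wrapper fitness for a binary feature mask (lower is better).

    Combines cross-validated classification error with the relative size
    of the subset; alpha weights accuracy against compactness. The k-NN
    classifier and the weighting are assumptions for illustration only.
    """
    selected = np.flatnonzero(mask)
    if selected.size == 0:               # an empty subset carries no information
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    error = 1.0 - cross_val_score(clf, X[:, selected], y, cv=5).mean()
    return alpha * error + (1.0 - alpha) * selected.size / X.shape[1]
```

A BSOS-style search would then minimize such a fitness over candidate masks produced by a transfer-function binarization step like the one sketched above.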
