Abstract

Feature selection, a method of dimensionality reduction, is the process of selecting an appropriate subset of features from the total set of features. This paper presents a detailed review of feature selection, the problems associated with it, and its evaluation techniques. The discussion begins with a straightforward approach in which feature handling and selection problems are addressed with meta-heuristic strategies, which help in obtaining the best feature subsets. The paper then discusses nature-inspired system models and the computations they use to handle feature selection in complex and massive data. It further covers algorithms such as the Genetic Algorithm (GA), the Non-Dominated Sorting Genetic Algorithm (NSGA-II), Particle Swarm Optimization (PSO), and other meta-heuristic strategies for addressing feature-selection problems. A comparison of these algorithms shows that feature selection benefits machine learning algorithms by improving their performance. The paper also presents various real-world applications of feature selection.
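To make the role of such meta-heuristics concrete, the sketch below shows one common way a genetic algorithm can act as a wrapper for feature selection: each candidate subset is encoded as a binary chromosome and evolved against a classifier's cross-validated accuracy. The synthetic dataset, the k-NN evaluator, and all GA settings (population size, one-point crossover, bit-flip mutation) are illustrative assumptions, not details taken from this paper.

```python
# Minimal sketch of GA-based wrapper feature selection (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a classifier on the selected features."""
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()

# Each individual is a binary chromosome: 1 = feature kept, 0 = feature dropped.
pop = rng.integers(0, 2, size=(30, X.shape[1])).astype(bool)
for generation in range(20):
    scores = np.array([fitness(ind) for ind in pop])
    # Tournament selection: keep the better of two randomly chosen individuals.
    parents = np.array([pop[max(rng.choice(len(pop), 2), key=lambda i: scores[i])]
                        for _ in range(len(pop))])
    # One-point crossover between consecutive parent pairs.
    children = parents.copy()
    for i in range(0, len(children) - 1, 2):
        cut = rng.integers(1, X.shape[1])
        children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:],
                                                    parents[i, cut:])
    # Bit-flip mutation with a small per-gene probability.
    flip = rng.random(children.shape) < 0.05
    pop = np.where(flip, ~children, children)

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best), "accuracy:", fitness(best))
```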

Highlights

  • Feature selection, or element selection, is the selection of an appropriate subset of features from the total set of features

  • Nowadays, feature selection is used in many fields

  • Feature selection reduces the total number of features and retains only the effective ones, removing noisy data from the input, which speeds up the target application

Summary

Introduction

Feature selection is applied in many fields. Feature selection, or element selection, is the selection of an appropriate subset of features from the total set of features. For N features there are 2^N candidate subsets, so exhaustive evaluation is an NP-hard problem. To avoid this, search techniques are used to reduce the space of candidate subsets, yielding heuristically selected subsets. Wrapper techniques can employ emerging search methods such as the Genetic Algorithm (GA) [6] or Particle Swarm Optimization (PSO) [7], which can deliver feasible performance and optimized solutions. In sequential selection techniques, the full set of features is given initially; the algorithm then removes incompatible features as needed, and the result is an optimized subset of features. During training, a greedy selection strategy is applied to obtain the optimized subsets, as sketched below.
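The greedy sequential strategy can be illustrated as a simple backward-elimination loop: start from the full feature set and repeatedly drop the single feature whose removal hurts cross-validated accuracy the least. The dataset, the logistic-regression evaluator, and the stopping point of five features below are illustrative assumptions rather than the exact procedure used in the reviewed works.

```python
# Minimal sketch of greedy sequential backward elimination (illustrative only).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=15,
                           n_informative=4, random_state=1)

def score(features):
    """Cross-validated accuracy using only the given feature indices."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, features], y, cv=5).mean()

selected = list(range(X.shape[1]))          # start from the full feature set
while len(selected) > 5:
    # Greedy step: drop the feature whose removal degrades accuracy the least.
    candidates = [[f for f in selected if f != drop] for drop in selected]
    selected = max(candidates, key=score)

print("remaining features:", selected, "accuracy:", round(score(selected), 3))
```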

Meta-heuristic Approaches for Feature Selection
Selection of Feature Using Different Algorithms
Comparison
Findings
Conclusions