Abstract

The larger a dataset, structured or unstructured, the harder it is to understand and make use of. Feature selection is one of the fundamentals of machine learning: by reducing the number of irrelevant or redundant features, it dramatically reduces the run time of a learning algorithm and leads to a more general concept. In this paper, we investigate feature selection through a neural-network-based algorithm whose topology is optimized by a genetic algorithm. We use NeuroEvolution of Augmenting Topologies (NEAT) to select the subset of features most relevant to the target concept. Discovery and improvement of solutions are two main goals of machine learning, but their accuracy varies with the dimensionality of the problem space. Although feature selection methods can help improve this accuracy, the complexity of the problem can also affect their performance. Artificial neural networks have proven effective for feature elimination, but because most neural networks have a fixed topology, they lose accuracy when the problem contains many local minima. To minimize this drawback, the topology of the neural network should be flexible, so that it can avoid local minima, especially when a feature is removed. In this work, we demonstrate the power of feature selection through NEAT. Compared with the evolution of networks with fixed structure, NEAT discovers significantly more sophisticated strategies. The results show that NEAT provides better accuracy than a conventional Multi-Layer Perceptron and leads to improved feature selection.

Highlights

How to cite this paper: Sohangir, S., et al. (2014) NeuroEvolutionary Feature Selection Using NeuroEvolution of Augmenting Topologies (NEAT)

  • Feature selection is a process commonly used in machine learning, wherein a subset of the features available from the data is selected for application of a learning algorithm

  • Since Utans' method relies on MSE and this value is not feasible for NEAT, in this paper we propose another feature selection method based on Utans' approach

  • In order to empirically evaluate NEAT for feature selection as implemented by our approximate algorithm, we ran a number of experiments on both artificial and real-world data
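The second highlight refers to Utans-style, MSE-based feature relevance. As a rough illustration only (not the paper's implementation, and with all names hypothetical), the idea can be sketched as: clamp one input feature to its mean and measure how much the trained model's MSE grows; features whose clamping barely changes the error are candidates for removal.

```python
def mse(pred, y):
    """Mean squared error between predictions and targets."""
    return sum((p - t) ** 2 for p, t in zip(pred, y)) / len(y)

def sensitivity(model, X, y, j):
    """Relevance of feature j: the increase in MSE when column j
    is clamped to its mean while the model itself stays fixed."""
    base = mse([model(row) for row in X], y)
    mean_j = sum(row[j] for row in X) / len(X)
    clamped = [row[:j] + [mean_j] + row[j + 1:] for row in X]
    return mse([model(row) for row in clamped], y) - base

# Toy "model" that only uses feature 0: clamping feature 1 costs
# nothing, clamping feature 0 degrades the fit noticeably.
def model(row):
    return 2.0 * row[0]

X = [[1.0, 5.0], [2.0, -3.0], [3.0, 0.5]]
y = [2.0, 4.0, 6.0]
print(sensitivity(model, X, y, 0) > sensitivity(model, X, y, 1))  # True
```

In practice `model` would be a trained network's forward pass; the point of the sketch is only the clamp-and-compare scoring step.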


Introduction

Feature selection (also known as subset selection) is a process commonly used in machine learning, wherein a subset of the features available from the data is selected for application of a learning algorithm. The best subset contains the fewest dimensions that contribute most to accuracy; the remaining, unimportant dimensions are disregarded. This is an important stage of pre-processing and is one of the two ways of avoiding the curse of dimensionality (the other being feature extraction). From a different point of view, feature selection can be categorized into node pruning and statistical pattern recognition (SPR) approaches (Figure 2). If these two categorizations are considered together, the combination of an Artificial Neural Network (ANN) and backward feature selection can be a powerful solution. However, when backward feature selection is used, removing features introduces more local minima, which leads to lower accuracy. To solve this problem and avoid the local minima, the ANN topology can be improved through complexification [1]. Gene duplication is a possible explanation of how natural evolution expanded the size of genomes, and it provides inspiration for adding new genes to artificial genomes as well.
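The backward feature selection loop described above can be sketched generically as follows. This is a minimal illustration, not the paper's NEAT-based implementation: `evaluate` and the toy scoring function are hypothetical stand-ins for training and validating a network on each candidate subset.

```python
def backward_elimination(features, evaluate, min_features=1):
    """Greedy backward feature elimination.

    features: list of feature names
    evaluate: callable(subset) -> score, higher is better (in
              practice, a trained model's validation accuracy)
    At each step, drop the feature whose removal hurts the score
    least; stop when every removal would decrease the score.
    """
    selected = list(features)
    best_score = evaluate(selected)
    while len(selected) > min_features:
        # score every candidate subset with one feature removed
        scored = [
            (evaluate([f for f in selected if f != drop]), drop)
            for drop in selected
        ]
        score, drop = max(scored, key=lambda t: t[0])
        if score < best_score:  # removing anything now hurts
            break
        best_score = score
        selected = [f for f in selected if f != drop]
    return selected, best_score

# Toy example: only "x1" and "x3" are relevant; others add noise.
relevant = {"x1", "x3"}
def toy_evaluate(subset):
    # hypothetical surrogate for model accuracy: reward relevant
    # features, lightly penalize irrelevant ones
    hits = len(relevant & set(subset))
    noise = len(set(subset) - relevant)
    return hits - 0.1 * noise

chosen, score = backward_elimination(["x1", "x2", "x3", "x4"], toy_evaluate)
print(sorted(chosen))  # ['x1', 'x3']
```

The paper's contribution sits inside `evaluate`: using NEAT, whose topology can complexify, rather than a fixed-topology network, so the score of a reduced subset is less likely to be trapped in the new local minima that feature removal introduces.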


