Abstract

In this paper, a novel search operation is proposed for the neuroevolution of augmented topologies, namely the difference-based mutation. This operator uses the differences between individuals in the population to perform a more efficient search for the optimal weights and structure of the model. The differences are determined according to the innovation numbers assigned to each node and connection, which allows the changes to be tracked. The implemented neuroevolution algorithm allows backward connections and loops in the topology, and uses a set of mutation operators, including connection merging and deletion. The algorithm is tested on a set of classification problems and on the rotary inverted pendulum control problem. The comparison is performed between the basic approach and the modified versions, and the sensitivity to parameter values is examined. The experimental results show that the newly developed operator delivers significant improvements in classification quality in several cases and allows finding better control algorithms.

Highlights

  • The development of machine learning in recent decades has resulted in several important findings, which allowed artificial neural networks (NN) to become one of the most widely used tools [1]

  • The described encoding scheme, proposed in [9], allows a minimal starting structure for all networks and performs the search in a space of small dimensionality, unlike Topology and Weight Evolving Artificial Neural Networks (TWEANNs) or genetic programming (GP), where solutions can be quite complicated even at the first generation

  • The proposed difference-based mutation uses the innovation numbers stored in every connection in the neuroevolution of augmented topologies (NEAT) architecture to find corresponding connections across individuals, and performs a mutation similar to the one used in differential evolution (a minimal sketch is given after these highlights)

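The following Python fragment is an illustrative sketch of such a NEAT-style encoding, not the authors' implementation: the class names, fields, and the minimal input-to-output wiring are assumptions used only to show how innovation numbers mark each connection and how a network starts from a minimal structure.

```python
# Illustrative sketch of a NEAT-style genome: node genes and connection genes,
# with every connection carrying an innovation number (historical marker).
# All names and fields here are hypothetical.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class ConnectionGene:
    in_node: int    # source node id
    out_node: int   # target node id
    weight: float   # connection weight, adjusted by mutation
    enabled: bool   # disabled genes are kept so the history is preserved
    innovation: int # global historical marker for this structural change

@dataclass
class Genome:
    nodes: Dict[int, str] = field(default_factory=dict)                   # id -> "input"/"hidden"/"output"
    connections: Dict[int, ConnectionGene] = field(default_factory=dict)  # keyed by innovation number

def minimal_genome(n_inputs: int, n_outputs: int, innovation_counter: int = 0) -> Genome:
    """A minimal starting network: inputs wired directly to outputs, no hidden
    nodes, so the search begins in a low-dimensional space and complexifies."""
    g = Genome()
    for i in range(n_inputs):
        g.nodes[i] = "input"
    for o in range(n_inputs, n_inputs + n_outputs):
        g.nodes[o] = "output"
    for i in range(n_inputs):
        for o in range(n_inputs, n_inputs + n_outputs):
            g.connections[innovation_counter] = ConnectionGene(i, o, 0.0, True, innovation_counter)
            innovation_counter += 1
    return g
```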

Summary

Introduction

The development of machine learning in recent decades has resulted in several important findings, which allowed artificial neural networks (NN) to become one of the most widely used tools [1]. Some of the most popular approaches are based on evolutionary algorithms (EAs), which use the concept of natural evolution to perform a direct search for solutions. Examples can be found in [5,6,7], where neural networks are evolved for image classification, showing competitive results compared to hand-designed architectures. This study is focused on developing new search operators for the NEAT approach, namely the difference-based mutation. This operator allows a more efficient search for optimal weights in the neural network by combining the information stored in different individuals of the population. The rest of the paper is organized as follows: Section 2 describes the NEAT algorithm used in this study, including the specific encoding scheme and mutation operators, and introduces the proposed mutation operator; Section 3 contains the experimental setup and results; and Section 4 contains the conclusion.
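To make the idea concrete, the sketch below shows a difference-based weight mutation in the spirit of the DE/rand/1 operator, with connection genes matched across individuals by their innovation numbers. The function name, the scaling factor F, the sampling of three donor genomes, and the flat {innovation number: weight} genome representation are assumptions for illustration only; the exact operator is defined in Section 2.

```python
import random

def difference_based_mutation(target: dict, population: list, F: float = 0.5) -> dict:
    """Perturb the target's weights by a scaled difference taken from donors.

    Each genome is represented here simply as {innovation_number: weight};
    genes are matched across genomes by innovation number, as in NEAT's
    historical markings. Requires at least four genomes in the population.
    """
    donors = [g for g in population if g is not target]
    base, r1, r2 = random.sample(donors, 3)
    trial = dict(target)  # the target itself is left unchanged
    for innov in target:
        # only genes whose innovation number exists in all sampled genomes
        # have a well-defined difference and are perturbed
        if innov in base and innov in r1 and innov in r2:
            trial[innov] = base[innov] + F * (r1[innov] - r2[innov])
    return trial
```

In this sketch, genes without a counterpart in the donors simply keep their weights, which mirrors how the innovation-number matching restricts the operator to corresponding connections.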

Neuroevolution
Encoding Scheme Used in NEAT
Crossover and Mutation Operators Used in NEAT
The General Scheme of the NEAT Algorithm
Proposed Mutation Operation
Results
Conclusions
