Abstract

Artificial Neural Networks (ANNs) with a fixed topology are increasingly used in everyday applications. However, for some problems it is difficult to design a suitable ANN by hand. Therefore, genetic algorithms such as NeuroEvolution of Augmenting Topologies (NEAT) have been developed to evolve both topologies and weights. A downside of NEAT is that it often generates unnecessarily large, inefficient ANNs. In this paper, we introduce an approach called Turbo NEAT, which combines divide-and-conquer methods with NEAT to enable a symbiosis of specialised smaller ANNs. In addition, we optimise the weights of the ANNs through backpropagation in order to compare topologies more fairly. Experiments on several problems show that these approaches can handle complex problems and lead to efficient ANNs.
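The abstract mentions refining the weights of evolved networks with backpropagation so that topologies are compared at their trained optimum rather than with raw evolved weights. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual pipeline: a small fixed feedforward topology (as NEAT might propose) is trained by gradient descent on the XOR task, and the loss before and after refinement can then serve as the topology's fitness signal.

```python
# Hypothetical sketch: refine the weights of a candidate topology with
# backpropagation before comparing it against other topologies.
# The 2-2-1 architecture, task, and hyperparameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Toy "evolved" topology: 2 inputs -> 2 hidden units -> 1 output (XOR task).
W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(inputs):
    h = sigmoid(inputs @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)          # network output
    return h, out

def mse(out):
    return float(np.mean((out - y) ** 2))

_, out0 = forward(X)
initial = mse(out0)                     # fitness before weight refinement

lr = 2.0
for _ in range(5000):
    h, out = forward(X)
    # Backpropagate the mean-squared error through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

_, out1 = forward(X)
final = mse(out1)                       # fitness after weight refinement
```

Comparing `final` across candidate topologies (instead of `initial`) separates the quality of a topology from the luck of its inherited weights, which is the motivation the abstract gives for the backpropagation step.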
