Abstract

Inspired by the evolution of biological nervous systems, Neuroevolution (NE) is an approach to Artificial Intelligence (AI) that uses evolutionary algorithms to evolve complex artificial neural networks capable of intelligent behavior. Kenneth O. Stanley et al. proposed an extension of this approach called Neuroevolution of Augmenting Topologies (NEAT) [1], which evolves both network topology and parameters. Its key features include complexification, the avoidance of competing conventions through historical markings, speciation, and fitness sharing. Over the years, the performance of NEAT has been improved by successor approaches such as HyperNEAT and CoDeepNEAT, and better training methods for NEAT have emerged as well. In this paper, we present an analysis of the efficiency and performance of the various algorithms proposed for Topology and Weight Evolving Artificial Neural Networks (TWEANNs). We also survey existing approaches according to their applications and purpose, along with the various training methods and the challenges encountered. This work provides learners with a better overview of past and current research trends in the field of Neuroevolution.
