Abstract

Neuroevolution of Augmenting Topologies (NEAT) has been a very successful algorithm for evolving Artificial Neural Networks (ANNs) that adapt their structure and processing to the task at hand. However, the algorithm is not always reliable when handling time-related processes, which may be due to the lack of explicit temporal elements within its structure. NEAT can handle time-dependent phenomena through the use of recurrences within the networks it builds, but it is well known that simple recurrences do not easily allow for precise temporal processing because of the history effect they induce on the networks. Many authors have argued for the introduction of other mechanisms, also present in natural systems, such as variable or trainable propagation delays in the synapses of networks that must deal with precise temporal processing. In this paper, we carry out an initial study of a new implementation of NEAT, called τ-NEAT, that includes the possibility of introducing variable delays in the synapses of the networks NEAT constructs. These delays can affect both regular direct synapses and recurrent connections. To evaluate the performance of this implementation, several tests are carried out over different types of temporal functions, and the results of the traditional version of NEAT and τ-NEAT are compared.
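The core mechanism described above, a synapse whose signal arrives several time steps after it is emitted, can be illustrated with a minimal sketch. This is not the paper's implementation; the class name, parameters, and FIFO-buffer modelling of the delay are assumptions made purely for illustration:

```python
from collections import deque


class DelayedSynapse:
    """Illustrative sketch of a synapse with a propagation delay.

    The delay is modelled as a FIFO buffer of length `delay`: a signal
    fed in at time t only contributes to the output at time t + delay.
    A delay of 1 behaves like an ordinary one-step connection.
    """

    def __init__(self, weight: float, delay: int):
        if delay < 1:
            raise ValueError("delay must be at least 1 time step")
        self.weight = weight
        # Pre-fill with zeros so the first `delay` outputs are defined.
        self.buffer = deque([0.0] * delay, maxlen=delay)

    def propagate(self, signal: float) -> float:
        """Push the current signal in; return the delayed, weighted one."""
        delayed = self.buffer[0]  # oldest value in the buffer
        self.buffer.append(signal)  # maxlen drops the oldest automatically
        return self.weight * delayed
```

For example, a synapse with `weight=2.0` and `delay=3` driven by a constant input of 1.0 outputs 0.0 for the first three steps and 2.0 afterwards, showing how the delay shifts the signal in time rather than merely smearing it as a simple recurrence would.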
