Abstract

The quantitative analysis of neural networks is critical for improving their performance. In this work, we extend existing adaptation methods to the more commonly used behavioural spaces, constructing an echo state network (ESN) behavioural space from the generalization rank, the kernel rank, and the memory capacity. After deriving the behavioural space, we investigate the factors that shape it and the methods for reducing its search complexity. This investigation reveals that the behavioural space converges to the expected distribution, even for a randomly generated ESN. We propose an optimization algorithm that adopts a novelty search genetic algorithm (NSGA), which combines the novelty and the quality of individuals. In the novelty search process, the time needed to construct the behavioural space is much shorter than the network training time, so the efficiency of network optimization is greatly improved. Based on the characteristics of the behavioural space distribution, we also propose a method that shrinks the search space, addressing the problem of an otherwise large search space. This approach overcomes the difficulties of traditional ESN parameter selection and the long optimization time of a genetic algorithm, and it provides a behavioural space theory that clarifies the influence of reservoir performance on tasks.

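As an illustration of the three behaviour descriptors named in the abstract, the following NumPy sketch estimates the kernel rank, generalization rank, and memory capacity of a randomly generated ESN. The reservoir size, spectral radius, sequence lengths, and estimation procedures are assumptions chosen for illustration and are not taken from the paper's implementation.

# Illustrative sketch (not the paper's code): behaviour descriptors for a random ESN.
# Assumed parameters: reservoir size N=100, spectral radius 0.9, tanh units.
import numpy as np

rng = np.random.default_rng(0)

N = 100                                         # reservoir size (assumed)
W = rng.uniform(-1, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # rescale to spectral radius 0.9
W_in = rng.uniform(-1, 1, N)

def run(u):
    """Drive the reservoir with input sequence u and return the final state."""
    x = np.zeros(N)
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
    return x

def state_rank(inputs):
    """Rank of the matrix of final states, one column per input stream."""
    X = np.column_stack([run(u) for u in inputs])
    return np.linalg.matrix_rank(X, tol=1e-6)

# Kernel rank: how well m unrelated input streams are separated in state space.
m, T = 50, 40
kernel_rank = state_rank([rng.uniform(-1, 1, T) for _ in range(m)])

# Generalization rank: streams share the same recent suffix, so only the
# distant past differs; a lower rank indicates better generalization.
suffix = rng.uniform(-1, 1, 10)
gen_rank = state_rank(
    [np.concatenate([rng.uniform(-1, 1, T - 10), suffix]) for _ in range(m)]
)

# Memory capacity: sum over delays k of the squared correlation between the
# k-step-delayed input and a linear readout trained to reconstruct it.
T_mc, washout, max_delay = 1200, 100, 40
u = rng.uniform(-1, 1, T_mc)
states = np.zeros((T_mc, N))
x = np.zeros(N)
for t in range(T_mc):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

mc = 0.0
for k in range(1, max_delay + 1):
    X = states[washout:]
    y = u[washout - k:T_mc - k]                 # input delayed by k steps
    w = np.linalg.lstsq(X, y, rcond=None)[0]    # ordinary least-squares readout
    mc += np.corrcoef(y, X @ w)[0, 1] ** 2

behaviour = (kernel_rank, gen_rank, mc)
print("behaviour descriptor:", behaviour)

The resulting tuple would serve as a network's coordinates in the behavioural space. Because these descriptors require only short reservoir runs and no task-specific training, evaluating a candidate network in this way is far cheaper than full training, which is the source of the efficiency gain described in the abstract.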