Abstract

An echo state network (ESN) is a reservoir computing framework consisting of an input layer, a reservoir, and a readout layer. The reservoir is a recurrent network of neuron models, and many reservoir models use analog neurons with a logistic output function. Time-series learning with ESNs requires a high memory capacity for storing the historical information of past inputs. However, analog neurons cannot store temporal history by themselves. Introducing internal neural dynamics that retain historical information can therefore enhance the memory capacity of ESNs. In this context, we hypothesized that evaluating the roles of the internal decay factors of chaotic neurons, and the optimal balance between them, would yield useful insights for improving the performance of ESNs built from chaotic neural networks. To validate this hypothesis, we investigated the performance of an ESN whose reservoir comprises a chaotic neural network (ChESN). The ChESN significantly outperformed a conventional ESN owing to its high memory capacity at large temporal scales, even when the spectral radius of the reservoir weight matrix was small. The proposed approach is expected to have wide applications in reservoir computing.
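
To make the described architecture concrete, the following is a minimal sketch of an ESN whose reservoir neurons carry internal states with decay factors, one for the recurrent-feedback term and one for the external-input term, and a logistic output function. The parameter names, values, toy task, and ridge-regression readout are illustrative assumptions, not the paper's exact ChESN model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100
spectral_radius = 0.5   # kept small, as discussed in the abstract (assumed value)
k_feedback = 0.9        # decay factor of the recurrent-feedback internal state (assumed)
k_input = 0.5           # decay factor of the external-input internal state (assumed)

# Input and reservoir weights; rescale reservoir weights to the target spectral radius.
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
W_res = rng.uniform(-1.0, 1.0, (n_res, n_res))
W_res *= spectral_radius / max(abs(np.linalg.eigvals(W_res)))

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect neuron outputs."""
    eta = np.zeros(n_res)    # internal state driven by recurrent feedback
    zeta = np.zeros(n_res)   # internal state driven by the external input
    x = np.zeros(n_res)      # neuron outputs
    states = []
    for u in inputs:
        # The decaying internal states accumulate a history of past activity and inputs.
        eta = k_feedback * eta + W_res @ x
        zeta = k_input * zeta + W_in @ np.atleast_1d(u)
        x = logistic(eta + zeta)
        states.append(x.copy())
    return np.asarray(states)

# Readout trained by ridge regression on the collected states (standard ESN practice);
# the one-step-delay task below is only a toy example.
inputs = np.sin(0.2 * np.arange(500))
targets = np.roll(inputs, 1)
S = run_reservoir(inputs)
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ targets)
predictions = S @ W_out
```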
