Abstract

Reservoir computing (RC), represented by echo state networks (ESNs), is a novel class of recurrent neural networks (RNNs) that is increasingly used in classification, chaotic time-series prediction, speech recognition, and other tasks. An ESN consists of a large number of randomly connected neurons (the "reservoir") and an adaptable output layer. The short-term memory of the reservoir strongly affects the performance of the ESN. However, because the neurons in the reservoir are connected at random, the relationship between the topological structure of the reservoir and the short-term memory of the ESN is not yet fully understood. In this paper, we establish a direct relationship between the memory of the network and its connectivity. We transform the iterative mathematical model of the ESN into a direct one, from which the reservoir topology can be determined inversely from the desired short-term memory. Furthermore, we find that several reservoir topologies proposed in previous papers are special solutions of our method.

Keywords: Reservoir computing; Echo state networks; Memory capability; Reservoir topology
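For context on the quantities the abstract refers to, the sketch below builds a conventional randomly connected reservoir, runs the standard iterative ESN state update, and estimates its short-term memory capacity in the usual way (the sum over delays of squared correlations between a delayed input and a trained linear readout). This is only an illustrative aid, not the paper's direct-model or inverse-design construction; the function name `memory_capacity` and all parameter values (reservoir size, sparsity, spectral radius, delay range) are assumptions.

```python
import numpy as np

def memory_capacity(W, w_in, u, max_delay=20, washout=100):
    """Estimate the short-term memory capacity of an ESN reservoir.

    W    : (N, N) reservoir weight matrix
    w_in : (N,)   input weights
    u    : (T,)   scalar input sequence (e.g. i.i.d. uniform noise)

    Returns the sum over delays k of the squared correlation between
    u(t - k) and the best linear readout of the reservoir state x(t).
    """
    N, T = W.shape[0], len(u)
    x = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):
        # standard iterative ESN state update
        x = np.tanh(W @ x + w_in * u[t])
        states[t] = x

    mc = 0.0
    for k in range(1, max_delay + 1):
        X = states[washout + k:]          # reservoir states after washout
        y = u[washout:T - k]              # input delayed by k steps
        w_out, *_ = np.linalg.lstsq(X, y, rcond=None)  # linear readout
        y_hat = X @ w_out
        c = np.corrcoef(y, y_hat)[0, 1]
        mc += c ** 2
    return mc

# Example: a sparse random reservoir scaled to spectral radius 0.9,
# a common heuristic for obtaining the echo state property.
rng = np.random.default_rng(0)
N = 100
W = rng.standard_normal((N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, N)
u = rng.uniform(-1, 1, 2000)
print("estimated memory capacity:", memory_capacity(W, w_in, u))
```

Under this setup the estimated memory capacity depends on the reservoir topology and scaling, which is the dependence the paper analyzes and inverts.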
