Abstract

Recurrent neural networks (RNNs) are successfully employed in processing information from temporal data. Approaches to training such networks are varied, and reservoir-computing-based approaches, such as the echo state network (ESN), make training particularly easy. Just as many machine learning algorithms produce an interpolation function or fit a curve, we observe that a driven system such as an RNN performs a continuous curve fitting if and only if it satisfies the echo state property. The domain of the learned curve is an abstract space of left-infinite input sequences, and the codomain is the space of readout values. When the input originates from discrete-time dynamical systems, we find theoretical conditions under which a topological conjugacy between the input and reservoir dynamics can exist, and we present numerical results relating the linearity in the reservoir to the forecasting ability of ESNs.
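
As context for the setting described above, the sketch below shows a generic echo state network: a fixed random reservoir driven by the input, with only a linear readout trained by ridge regression. This is a minimal illustrative example under assumed hyperparameters (reservoir size, spectral radius, regularization) and a toy forecasting task; it is not the specific construction analyzed in the paper.

```python
import numpy as np

# Minimal echo state network sketch (illustrative; all names and
# hyperparameters below are assumptions, not taken from the paper).
rng = np.random.default_rng(0)

n_in, n_res = 1, 200           # input and reservoir dimensions (assumed)
spectral_radius = 0.9          # < 1 is commonly used to encourage the echo state property
leak = 1.0                     # leaky-integration rate (1.0 = standard tanh update)

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (T x n_in); return states (T x n_res)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u_t + W @ x)
        states[t] = x
    return states

# Toy task: one-step-ahead forecasting of a scalar signal (assumed example).
T = 2000
u = np.sin(0.3 * np.arange(T + 1))[:, None]
X = run_reservoir(u[:-1])      # reservoir states driven by the input
Y = u[1:]                      # targets: next input value

washout = 100                  # discard the initial transient
X_tr, Y_tr = X[washout:], Y[washout:]

# Only the linear readout is trained, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_res), X_tr.T @ Y_tr).T

pred = X @ W_out.T
print("train MSE:", np.mean((pred[washout:] - Y[washout:]) ** 2))
```

The reservoir weights stay fixed; training reduces to a linear least-squares problem on the readout, which is the ease of training the abstract refers to.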
