Abstract

Reservoir computers are a fast-training variant of recurrent neural networks that excel at approximating nonlinear dynamical systems and at time series prediction. These machine learning models act as self-organizing nonlinear fading-memory filters. While they benefit from low overall complexity, the matrix computations remain a bottleneck. This work applies the controllability matrix of control theory to quickly identify a reduced-size replacement reservoir. Given a large, task-effective reservoir matrix, we calculate the rank of the associated controllability matrix. This simple calculation identifies the required rank for a reduced-size replacement, yielding further speed-ups for an already fast machine learning model. Additionally, this rank calculation characterizes the reachable set in state space required to model the input data.
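The rank calculation described above follows the standard control-theoretic construction: for a reservoir weight matrix $A$ of size $n \times n$ and input weight matrix $B$, the controllability matrix is $\mathcal{C} = [B, AB, A^2B, \ldots, A^{n-1}B]$, and its rank gives the dimension of the reachable subspace. A minimal sketch in NumPy (the function name and example matrices are illustrative, not from the paper):

```python
import numpy as np

def controllability_rank(A, B):
    """Rank of the controllability matrix C = [B, AB, A^2 B, ..., A^(n-1) B].

    A: (n, n) reservoir weight matrix.
    B: (n, m) input weight matrix (use shape (n, 1) for a scalar input).
    """
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])  # next Krylov block A^k B
    C = np.hstack(blocks)
    return np.linalg.matrix_rank(C)

# Classic textbook check: a 2-state chain where the input only drives
# the first state cannot reach the second state (rank 1), while driving
# the second state reaches both (rank 2).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(controllability_rank(A, np.array([[1.0], [0.0]])))  # → 1
print(controllability_rank(A, np.array([[0.0], [1.0]])))  # → 2
```

Per the abstract, this rank would then serve as the state dimension of the reduced replacement reservoir; note that for generic random reservoir matrices the rank is often full, so the reduction is most useful when the reservoir or input coupling has structure.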
