Abstract

Echo state networks (ESNs) are reservoir computing-based recurrent neural networks widely used in pattern analysis and machine intelligence applications. To achieve high accuracy with large model capacity, ESNs usually contain a large internal layer (reservoir), making the evaluation process too slow for some applications. In this work, we speed up ESN evaluation by building a reduced network, called the fast ESN (fastESN), and achieve, for the first time, an ESN evaluation complexity independent of the original ESN size. FastESN is generated using three techniques. First, the high-dimensional state of the original ESN is approximated by a low-dimensional state through proper orthogonal decomposition (POD)-based projection. Second, the number of activation function evaluations is reduced through the discrete empirical interpolation method (DEIM). Third, we show that the directly generated fastESN suffers from instability problems and provide a stabilization scheme as a solution. Through experiments on four popular benchmarks, we show that fastESN accelerates sparse storage-based ESN evaluation, achieving a high parameter compression ratio and a fast evaluation speed.
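The sketch below illustrates the first two steps described above (POD-based state projection and DEIM-based reduction of activation evaluations) on a toy ESN. It is a minimal illustration, not the authors' implementation: it assumes the standard update x[k+1] = tanh(W x[k] + W_in u[k]), all sizes and variable names are illustrative, and the stabilization scheme from the third step is omitted.

```python
import numpy as np

# Assumed ESN update (illustrative, not the paper's exact model):
#   x[k+1] = tanh(W x[k] + W_in u[k]),  reservoir size N.
rng = np.random.default_rng(0)
N, n_in, T, r = 500, 1, 300, 20      # reservoir size, input dim, snapshot count, reduced order

W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
W_in = rng.normal(size=(N, n_in))
u = rng.normal(size=(T, n_in))

# --- Collect state snapshots of the full ESN --------------------------------
X = np.zeros((N, T))
x = np.zeros(N)
for k in range(T):
    x = np.tanh(W @ x + W_in @ u[k])
    X[:, k] = x

# --- POD: truncated SVD of the snapshot matrix gives an orthonormal basis V ---
V, _, _ = np.linalg.svd(X, full_matrices=False)
V = V[:, :r]                          # N x r projection basis, reduced state xr = V.T @ x

# --- DEIM: greedily select r interpolation indices from nonlinearity snapshots ---
F = np.tanh(W @ X + W_in @ u.T)       # snapshots of the nonlinear term
Uf, _, _ = np.linalg.svd(F, full_matrices=False)
Uf = Uf[:, :r]
idx = [int(np.argmax(np.abs(Uf[:, 0])))]
for j in range(1, r):
    c = np.linalg.solve(Uf[idx, :j], Uf[idx, j])   # interpolate j-th basis vector
    res = Uf[:, j] - Uf[:, :j] @ c                 # pick index with largest residual
    idx.append(int(np.argmax(np.abs(res))))

# --- Reduced (fastESN-style) update: only r tanh evaluations per step --------
E = V.T @ Uf @ np.linalg.inv(Uf[idx])  # r x r DEIM projector V^T Uf (P^T Uf)^{-1}
Wr = (W @ V)[idx]                      # r x r: selected rows of W V
Wr_in = W_in[idx]                      # r x n_in: selected rows of W_in
xr = np.zeros(r)
for k in range(T):
    xr = E @ np.tanh(Wr @ xr + Wr_in @ u[k])

# The reduced state can be lifted back for comparison: x_approx = V @ xr
```

All per-step operations in the reduced loop involve only r-dimensional quantities, which is why the evaluation cost no longer depends on the original reservoir size N; as the abstract notes, this directly reduced model can be unstable without the paper's additional stabilization step.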
