Abstract
Reservoir Computing is a computing model well suited to performing computation on varied physical substrates. However, such physical reservoirs can be difficult to scale up. We propose joining multiple reservoirs together as an approach to this problem, simulating the physical reservoirs with Echo State Networks (ESNs). We investigate several methods of combining ESNs to form larger reservoirs, including a method that we dub restricted ESNs. We provide a notation for describing restricted ESNs, and use it to benchmark a standard ESN against restricted ones. We investigate two methods of keeping the weight matrix density consistent when comparing a restricted ESN to a standard one, which we call overall consistency and patch consistency. We benchmark restricted ESNs on the NARMA10 and sunspot prediction benchmarks, and find that restricted ESNs perform similarly to standard ones. We present some application scenarios in which restricted ESNs may offer advantages over standard ESNs. We then test restricted ESNs on a version of the multi-timescale Multiple Superimposed Sines task, in order to establish a baseline performance that can be improved upon in further work. We conclude that we can scale up reservoir performance by linking small homogeneous subreservoirs together without significant loss in performance compared with a single large reservoir, justifying future work on using heterogeneous subreservoirs for greater flexibility.
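The abstract only names the restricted-ESN architecture, so the sketch below is one plausible reading of it, not the paper's implementation: a block-structured reservoir weight matrix in which each diagonal block is a subreservoir and only neighbouring subreservoirs are coupled by sparse off-diagonal blocks. All names and parameters (restricted_esn_weights, sub_density, link_density, the chain coupling pattern) are illustrative assumptions. Under this reading, "patch consistency" would fix each block's density to match the standard ESN being compared against, while "overall consistency" would fix the density of the combined matrix as a whole.

```python
import numpy as np

rng = np.random.default_rng(42)

def sparse_block(rows, cols, density, scale=1.0):
    """Random weight block in [-scale, scale] with the given connection density."""
    mask = rng.random((rows, cols)) < density
    weights = rng.uniform(-scale, scale, size=(rows, cols))
    return np.where(mask, weights, 0.0)

def restricted_esn_weights(n_sub, sub_size, sub_density, link_density, spectral_radius=0.95):
    """Hypothetical restricted-ESN weight matrix: each diagonal block is a
    subreservoir; sparse off-diagonal blocks couple only neighbouring
    subreservoirs in a chain (the assumed 'restriction')."""
    n = n_sub * sub_size
    W = np.zeros((n, n))
    for i in range(n_sub):
        lo, hi = i * sub_size, (i + 1) * sub_size
        W[lo:hi, lo:hi] = sparse_block(sub_size, sub_size, sub_density)
        if i + 1 < n_sub:
            nlo, nhi = hi, hi + sub_size
            W[lo:hi, nlo:nhi] = sparse_block(sub_size, sub_size, link_density)
            W[nlo:nhi, lo:hi] = sparse_block(sub_size, sub_size, link_density)
    # Rescale to a target spectral radius, as is standard for ESN reservoirs.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W

W = restricted_esn_weights(n_sub=4, sub_size=50, sub_density=0.2, link_density=0.05)
print(W.shape, np.count_nonzero(W) / W.size)  # size and overall density of the combined reservoir
```

Adjusting sub_density and link_density is then the knob that decides whether each patch or the matrix as a whole matches the density of an equivalent standard ESN.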