Abstract

Reservoir Computing (RC) is an effective approach to designing and training recurrent neural networks, and it has been applied successfully and widely to real-valued time-series modeling tasks. However, RC has been criticized as insufficiently principled: because the reservoir's connectivity and weight structure are generated randomly, the reservoir is unlikely to be optimal. A Simple Cycle Reservoir Network (SCRN), whose connectivity and weight structure are constructed deterministically, can achieve performance competitive with a standard Echo State Network (ESN). To determine a proper reservoir size and to improve the generalization ability of the SCRN, a Sensitive Iterated Pruning Algorithm (SIPA) is proposed to optimize the reservoir size and weights: a larger-than-necessary reservoir is employed first, and its size is then reduced by pruning out the least sensitive internal units. A system-identification task and two time-series benchmark tasks are used to demonstrate the feasibility and superiority of SIPA. The results show that SIPA significantly outperforms a Least Angle Regression (LAR) method and improves the generalization performance of the SCRN. In addition, two well-known reservoir characterizations, the pseudo-Lyapunov exponent of the reservoir dynamics and the Memory Capacity, are investigated, together with the impact of SIPA on both.
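The abstract does not spell out the construction or pruning details, but the following Python sketch illustrates the two ingredients as they are commonly described in the simple cycle reservoir literature: a reservoir whose recurrent weights form a single ring with one shared weight r and whose input weights share one magnitude v, and an iterated pruning loop that repeatedly retrains the readout and removes the least sensitive unit. The sensitivity proxy (|w_out_i| * std(x_i)), the input-sign pattern, and the handling of the ring after a deletion are illustrative assumptions, not the paper's exact procedure; the function names (simple_cycle_reservoir, iterated_pruning) are hypothetical.

    import numpy as np

    def simple_cycle_reservoir(n_units, r=0.9, v=0.5, n_inputs=1):
        # Ring topology: unit i feeds unit (i+1) mod N, all weights equal to r.
        W = np.zeros((n_units, n_units))
        for i in range(n_units):
            W[(i + 1) % n_units, i] = r
        # Input weights share one magnitude v; the sign pattern below is a
        # deterministic stand-in (fixed-seed draw) for the paper's sign sequence.
        signs = np.where(np.random.default_rng(42).random((n_units, n_inputs)) < 0.5,
                         -1.0, 1.0)
        return W, v * signs

    def run_reservoir(W, W_in, u):
        # Collect reservoir states for a scalar input sequence u.
        x = np.zeros(W.shape[0])
        X = np.empty((len(u), W.shape[0]))
        for t, ut in enumerate(u):
            x = np.tanh(W @ x + W_in @ np.atleast_1d(ut))
            X[t] = x
        return X

    def train_readout(X, y, ridge=1e-6):
        # Linear readout trained by ridge regression.
        return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)

    def iterated_pruning(W, W_in, u, y, target_size):
        # Repeatedly retrain, then drop the unit with the smallest sensitivity,
        # approximated here as |w_out_i| * std(x_i): a unit matters little if its
        # readout weight or its activation variance is small.  Deleting a unit
        # simply breaks the ring at that point (an illustrative assumption).
        keep = np.arange(W.shape[0])
        while keep.size > target_size:
            X = run_reservoir(W[np.ix_(keep, keep)], W_in[keep], u)
            w_out = train_readout(X, y)
            sensitivity = np.abs(w_out) * X.std(axis=0)
            keep = np.delete(keep, np.argmin(sensitivity))
        return keep

    # Toy usage: a 200-unit reservoir pruned to 150 on a short-memory task.
    rng = np.random.default_rng(0)
    u = rng.uniform(0, 0.5, 1000)
    y = np.convolve(u, np.ones(10) / 10, mode="same")  # 10-step moving average
    W, W_in = simple_cycle_reservoir(200)
    kept = iterated_pruning(W, W_in, u, y, target_size=150)
    print(f"{kept.size} units kept after pruning")

One design note: retraining the readout after every deletion is what makes the pruning "iterated"; ranking all units once against the initial readout would be cheaper but can misjudge units whose contribution is only redundant in the presence of others.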
