Abstract

The Echo State Network (ESN) is a specific form of recurrent neural network that displays very rich dynamics owing to its reservoir-based hidden neurons. As such, the ESN is viewed as a powerful approach to modeling real-valued time series processes. Nevertheless, the ESN has been criticized because parameters such as the initial input weights and the reservoir weights are set by manual experience or brute-force search; that is, a conventional ESN is unlikely to be optimal because its reservoir and input weights are generated randomly. The Simple Cycle Reservoir Network (SCRN), whose input and internal weights are constructed deterministically, can yield performance comparable with that of the conventional ESN. In this paper, a Redundant Unit Pruning Auto-Encoder (RUP-AE) algorithm is proposed to optimize the input weights of the SCRN and to resolve the problem of an ill-conditioned output weight matrix, through an unsupervised pre-training process. First, the output weight matrix of the SCRN is pre-trained on the training data using the pseudo-inverse algorithm. Then, the pre-trained output weight matrix is pruned by a Redundant Unit Pruning (RUP) algorithm. Finally, the pruned output weight matrix is injected into the input weight matrix, which gives the auto-encoder its specific structure. Three tasks, namely a nonlinear system identification task, a real-valued time series benchmark, and a standard chaotic time series benchmark, are used to demonstrate the advantages of RUP-AE. Extensive experimental results show that RUP-AE is effective in improving the performance of the SCRN and, at the same time, resolves the problem of its ill-conditioned output weight matrix.
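The sketch below illustrates the three-step pipeline described above; it is a minimal interpretation of the abstract, not the authors' implementation. The reservoir size, cycle weight, input-weight magnitude, the correlation-based redundancy criterion used as a stand-in for RUP, and the way the pruned output weights are "injected" back into the input weights (copied directly, with pruned units zeroed) are all assumptions made for illustration.

```python
# Hedged sketch of the RUP-AE idea, assuming a one-input SCRN and a simple
# correlation-based stand-in for the Redundant Unit Pruning criterion.
import numpy as np

rng = np.random.default_rng(0)

# --- Simple Cycle Reservoir (SCR): deterministic cycle topology ---
N, r, v = 100, 0.9, 0.5                         # reservoir size, cycle weight, input scale (assumed)
W = np.zeros((N, N))
W[np.arange(1, N), np.arange(N - 1)] = r        # unit i feeds unit i+1 ...
W[0, N - 1] = r                                 # ... and the last unit closes the cycle
signs = np.where(rng.random(N) < 0.5, -1.0, 1.0)  # sign pattern for the input weights
W_in = v * signs                                # single input channel, fixed magnitude v

def run_reservoir(u, w_in):
    """Drive the reservoir with the input sequence u and collect its states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)                     # shape (T, N)

# Toy scalar series, one-step-ahead prediction (placeholder data)
u = np.sin(0.2 * np.arange(1000))
y = u[1:]
X = run_reservoir(u[:-1], W_in)

# Step 1: pre-train the output weights with the pseudo-inverse
W_out = np.linalg.pinv(X) @ y                   # shape (N,)

# Step 2: Redundant Unit Pruning (stand-in criterion: zero the readout weight of
# one unit in every pair whose state traces are almost perfectly correlated)
C = np.corrcoef(X.T)
keep = np.ones(N, dtype=bool)
for i in range(N):
    for j in range(i + 1, N):
        if keep[i] and keep[j] and abs(C[i, j]) > 0.99:
            keep[j] = False
W_out_pruned = np.where(keep, W_out, 0.0)

# Step 3: inject the pruned output weights back into the input weights
# (auto-encoder-style reuse of the learned readout as the new input mapping)
W_in_new = W_out_pruned

# Re-collect states with the optimized input weights and retrain the readout
X_new = run_reservoir(u[:-1], W_in_new)
W_out_final = np.linalg.pinv(X_new) @ y
print("train MSE:", np.mean((X_new @ W_out_final - y) ** 2))
```

In this reading, the unsupervised pre-training serves only to shape the input weights: the pseudo-inverse readout is learned, redundant units are pruned out of it, and the result replaces the original deterministic input weights before the final readout is trained.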
