Abstract

The Echo State Network (ESN) is a class of recurrent neural networks that exhibits rich dynamics owing to its reservoir of hidden neurons, and it has proven to be a powerful approach for modeling real-valued time series. To integrate ESN with deep learning, the Deep Belief Echo State Network (DBESN) was introduced to address the slow convergence of the Deep Belief Network (DBN). In DBESN, the DBN part performs unsupervised feature learning and the ESN part serves as the regression layer. However, the ESN input layer in DBESN is still not trained in an unsupervised manner. Moreover, the DBN layer dramatically increases the input dimension of the ESN, which makes the input scaling parameters more difficult to construct. To construct an optimal input weight matrix and input scaling parameters for the ESN layer of DBESN, this paper proposes a novel Sensitivity Analysis Input Scaling Auto-Encoder (SAIS-AE) algorithm based on an unsupervised pre-training process. First, the output weight matrix of the ESN layer is pre-trained on the entire input data set. Then, the pre-trained output weight matrix is injected into the input weight matrix of the ESN layer to preserve the auto-encoder (AE) property. Finally, the input scaling parameters of the ESN layer are tuned using a sensitivity analysis algorithm. Two multivariate sequence tasks and one univariate sequence benchmark are used to demonstrate the advantages of SAIS-AE. Extensive experimental results show that our SAIS-AE-DBESN model can effectively improve the performance of DBESN.
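
The following is a minimal sketch of the auto-encoder pre-training idea described above, assuming a basic leaky-integrator ESN with a ridge-regression readout; all names (reservoir size, leak rate, `input_scaling`, the toy data) are illustrative assumptions, not the paper's actual implementation, and the sensitivity-analysis tuning step is only indicated by a placeholder comment.

```python
# Hedged sketch: ESN auto-encoder pre-training, not the authors' exact SAIS-AE code.
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(U, W_in, W_res, leak=0.3):
    """Collect reservoir states for an input sequence U of shape (T, d_in)."""
    T = U.shape[0]
    n = W_res.shape[0]
    X = np.zeros((T, n))
    x = np.zeros(n)
    for t in range(T):
        x = (1 - leak) * x + leak * np.tanh(W_in @ U[t] + W_res @ x)
        X[t] = x
    return X

# Toy data standing in for the features that the DBN layer would feed to the ESN.
T, d_in, n_res = 500, 8, 100
U = rng.standard_normal((T, d_in))

# Random reservoir rescaled to spectral radius < 1 (echo state property).
W_res = rng.standard_normal((n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))

input_scaling = 0.5  # placeholder; the paper tunes this via sensitivity analysis
W_in = input_scaling * rng.uniform(-1.0, 1.0, (n_res, d_in))

# Step 1: pre-train the output weights as an auto-encoder, i.e. ridge-regress
# the reservoir states back onto the inputs themselves (reconstruction target = U).
X = run_reservoir(U, W_in, W_res)
ridge = 1e-6
W_out_ae = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ U)  # shape (n_res, d_in)

# Step 2: inject the pre-trained output weights into the input weight matrix,
# scaled by the input scaling parameter.
W_in = input_scaling * W_out_ae

# Step 3 (not shown): tune input_scaling with the sensitivity analysis algorithm
# and re-run the reservoir with the updated W_in before training the final readout.
```

This sketch only illustrates the data flow of the pre-training and injection steps on synthetic inputs; in the paper these steps operate on the DBN-extracted features within the full DBESN pipeline.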
