Abstract

As a class of recurrent neural networks (RNNs), echo state networks (ESNs) have been studied extensively in recent years, particularly for time-series prediction and nonlinear system identification. Deep learning (DL) techniques have been applied to ESNs and have shown better performance than conventional ESNs. However, because of the large-scale datasets and model complexity involved, training deep ESNs with traditional centralized algorithms is highly time-consuming. This paper therefore presents a decentralized training algorithm that combines DL and ESNs, aimed at processing temporal data on multiple time scales. The algorithm builds on the alternating direction method of multipliers (ADMM) optimization framework and a decentralized average consensus (DAC) procedure. Results on a large-scale artificial dataset show that it outperforms the centralized scheme in both generalization accuracy and training time.
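The combination the abstract describes, an ADMM decomposition whose averaging step can be carried out by DAC, can be illustrated with a minimal sketch of consensus ADMM for a distributed ridge-regression readout (the readout being the only trained part of an ESN). All sizes, parameter values, and variable names below are illustrative assumptions, not the paper's actual algorithm or configuration.

```python
import numpy as np

# Hedged sketch: consensus ADMM for ridge regression split across agents.
# Problem: min_w  sum_i ||X_i w - y_i||^2 + lam * ||w||^2,
# where agent i privately holds the shard (X_i, y_i).
rng = np.random.default_rng(0)
n_feat, n_agents, n_local = 5, 4, 50
w_true = rng.normal(size=n_feat)            # ground-truth weights (toy data)

# Each agent holds a private data shard (X_i, y_i).
shards = []
for _ in range(n_agents):
    X = rng.normal(size=(n_local, n_feat))
    y = X @ w_true + 0.01 * rng.normal(size=n_local)
    shards.append((X, y))

lam, rho = 0.1, 1.0                          # ridge penalty, ADMM step size
w = [np.zeros(n_feat) for _ in range(n_agents)]  # local primal variables
u = [np.zeros(n_feat) for _ in range(n_agents)]  # scaled dual variables
z = np.zeros(n_feat)                             # consensus variable

for _ in range(100):
    # Local step: each agent solves its regularized least-squares subproblem
    #   (2 X_i^T X_i + rho I) w_i = 2 X_i^T y_i + rho (z - u_i).
    for i, (X, y) in enumerate(shards):
        A = 2 * X.T @ X + rho * np.eye(n_feat)
        b = 2 * X.T @ y + rho * (z - u[i])
        w[i] = np.linalg.solve(A, b)
    # Consensus step: needs only the network-wide average of (w_i + u_i);
    # in a decentralized setting this average is what DAC computes,
    # so no central coordinator is required.
    s = sum(wi + ui for wi, ui in zip(w, u))
    z = rho * s / (2 * lam + n_agents * rho)
    # Dual step: each agent updates its own multiplier locally.
    for i in range(n_agents):
        u[i] += w[i] - z

# Centralized reference solution for comparison:
# (X^T X + lam I) w = X^T y over the pooled data.
X_all = np.vstack([X for X, _ in shards])
y_all = np.concatenate([y for _, y in shards])
w_central = np.linalg.solve(X_all.T @ X_all + lam * np.eye(n_feat),
                            X_all.T @ y_all)
```

After the loop, the consensus variable `z` matches the centralized ridge solution `w_central`, which is the basic reason a decentralized scheme of this kind can reach the same generalization accuracy as centralized training while splitting the computational load.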
