Abstract

Reservoir computing (RC) is considered a suitable alternative to gradient-descent methods for training recurrent neural networks (RNNs). The echo state network (ESN) is a standard platform for RC and for simulating nonlinear systems. In past research, the largest eigenvalue of the reservoir connection weight matrix (the spectral radius) was used to predict reservoir dynamics. Some researchers have shown that scale-free and small-world characteristics can improve the approximation capability of echo state networks; however, recent studies indicate that infrastructures such as clusters alter the stability criteria of these reservoirs. In this research, we propose a highly clustered ESN, called HCESN, whose internal neurons are interconnected in the form of clusters. Each cluster contains one backbone node and a number of local nodes. We implemented a classical clustering algorithm, K-means, and three optimization algorithms, namely the genetic algorithm (GA), differential evolution (DE), and particle swarm optimization (PSO), to improve the clustering efficiency of the new reservoir, and compared them with each other. To investigate the spectral radius and predictive power of the resulting reservoirs, we applied them to a laser time series and the Mackey-Glass dynamical system. We demonstrate that the new clustered reservoirs share properties of biological neural systems and complex networks, such as short average path length, high clustering coefficient, and power-law degree distribution. The empirical results show that the PSO-based ESN can strikingly enhance the echo state property (ESP) and achieves lower prediction error on chaotic time series than previous works and the original ESN. It can therefore approximate nonlinear dynamical systems and predict chaotic time series.
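
The construction below is a minimal Python sketch of the kind of clustered reservoir the abstract describes: neurons grouped into clusters that are densely wired internally and sparsely wired between clusters, one backbone node per cluster wired to all of its local nodes, and a weight matrix rescaled to a chosen spectral radius. The layout, connection probabilities, and function names are illustrative assumptions, not the authors' exact HCESN construction.

```python
# Hedged sketch of a clustered echo-state reservoir (illustrative layout,
# not the paper's exact HCESN construction).
import numpy as np

rng = np.random.default_rng(0)

def clustered_reservoir(n_neurons=300, n_clusters=10,
                        p_intra=0.2, p_inter=0.005, spectral_radius=0.9):
    labels = rng.integers(0, n_clusters, n_neurons)    # cluster of each neuron
    same = labels[:, None] == labels[None, :]          # intra-cluster pairs
    prob = np.where(same, p_intra, p_inter)            # dense inside, sparse between
    mask = rng.random((n_neurons, n_neurons)) < prob
    # One "backbone" per cluster: wire it to every local node in its cluster.
    for c in range(n_clusters):
        members = np.flatnonzero(labels == c)
        if members.size:                               # a cluster may be empty
            mask[members[0], members] = mask[members, members[0]] = True
    W = np.where(mask, rng.standard_normal((n_neurons, n_neurons)), 0.0)
    # Rescale so the largest absolute eigenvalue equals spectral_radius.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W

def reservoir_step(W, W_in, x, u):
    """One plain ESN update: x(t+1) = tanh(W x(t) + W_in u(t))."""
    return np.tanh(W @ x + W_in @ u)

W = clustered_reservoir()
```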

Highlights

  • Unlike feed-forward neural networks, recurrent neural networks are costly and challenging to train with traditional methods such as gradient descent because of their feedback loops

  • The reservoir is clustered with K-means and three evolutionary optimization algorithms, yielding four variants: HCESN-KM, HCESN-GA (genetic algorithm), HCESN-DE (differential evolution), and HCESN-PSO (particle swarm optimization)

  • HCESN with particle swarm optimization (HCESN-PSO) is recommended for reservoir design (see the sketch after this list)
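
Since the highlights single out PSO, here is a hedged sketch of one way PSO could drive the clustering step: each particle encodes candidate cluster-center ("backbone") positions for neurons laid out in 2-D, and fitness is the total distance of neurons to their nearest center. The objective, the 2-D layout, and the function pso_cluster_centers are assumptions for illustration; the paper's actual encoding and fitness function may differ.

```python
# Hedged sketch: PSO placing cluster centers for reservoir neurons in 2-D.
import numpy as np

rng = np.random.default_rng(1)

def pso_cluster_centers(points, k=5, n_particles=20, iters=100,
                        w=0.7, c1=1.5, c2=1.5):
    dim = k * points.shape[1]                     # each particle: k flattened centers
    lo, hi = points.min(0), points.max(0)
    pos = rng.uniform(np.tile(lo, k), np.tile(hi, k), (n_particles, dim))
    vel = np.zeros_like(pos)

    def fitness(flat):
        centers = flat.reshape(k, -1)
        d = np.linalg.norm(points[:, None, :] - centers[None], axis=2)
        return d.min(axis=1).sum()                # distance to nearest center, summed

    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    g = pbest[pbest_f.argmin()].copy()            # global best
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos += vel
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g.reshape(k, -1)

# Example: cluster 300 random neuron positions into 5 groups.
centers = pso_cluster_centers(rng.random((300, 2)), k=5)
```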

Summary

Introduction

Unlike feed-forward neural networks, recurrent neural networks are costly and challenging to train with traditional methods such as gradient descent because of their feedback loops. As Jaeger [7] noted, an ESN has the echo state property (ESP) only if, after running, the current reservoir state is uniquely determined by the long-term history of inputs. Deng and Zhang [26] suggested a complex ESN model with a gradually growing state reservoir, comprising a large number of sparsely connected internal nodes. Their experimental results showed that the echo state property could be improved by permitting a much wider range of acceptable spectral radii. The empirical results show that the suggested method, HCESN-PSO, outperforms the previous ones in its capability to approximate nonlinear dynamical systems and its prediction accuracy on chaotic time series. The last section is devoted to conclusions.
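
Jaeger's condition can be probed empirically: drive one reservoir from two different initial states with the same input sequence and check that the state trajectories converge. The following Python sketch assumes a plain sparse random reservoir and a spectral radius of 0.9; it is an illustrative test, not the paper's procedure.

```python
# Hedged sketch of an empirical echo-state-property check: identical input,
# two different initial states; a vanishing state distance indicates the
# reservoir state depends on the input history alone (i.e. ESP holds).
import numpy as np

rng = np.random.default_rng(2)
n, T = 200, 500
W = rng.standard_normal((n, n)) * (rng.random((n, n)) < 0.05)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9
W_in = rng.uniform(-0.5, 0.5, (n, 1))
u = rng.uniform(-1, 1, (T, 1))                    # shared input sequence

xa, xb = rng.standard_normal(n), rng.standard_normal(n)
gap = []
for t in range(T):
    xa = np.tanh(W @ xa + W_in @ u[t])
    xb = np.tanh(W @ xb + W_in @ u[t])
    gap.append(np.linalg.norm(xa - xb))

print(f"state distance: start {gap[0]:.3f} -> end {gap[-1]:.2e}")
```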

Related work
Generation of initial population
HCESN’s reservoir architecture
Experimental results and discussion
Small-world property
Test criteria
Conclusions