Abstract

Echo State Networks (ESNs) are typically composed of additive units with sigmoid activation functions. They are built around a randomly connected recurrent neural structure called the reservoir. Obtaining a good reservoir depends mainly on choosing the right parameters at network initialization. Human expertise and repeated trials may sometimes yield acceptable parameters, but they offer no guarantees. Optimization techniques based on evolutionary learning, on the other hand, have proven highly effective at finding optimal solutions in complex search spaces. Particle swarm optimization (PSO) is one of the most popular continuous evolutionary algorithms. In this paper, a PSO algorithm is coupled with an ESN to pre-train some of the network's otherwise fixed weight values. Once the network's initial parameters are set, a subset of the untrained weights is selected for optimization; the optimized weights are then injected back into the network, which proceeds with its normal training process. Network performance is evaluated in terms of prediction error and processing time. The testing results after PSO pre-training are compared with those of an ESN without optimization and with other existing approaches. The proposed approach is tested on time series prediction tasks over a set of benchmark and real-life datasets. Experimental results show a clear improvement in ESN learning performance.
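The pipeline the abstract describes (initialize a random reservoir, let PSO tune a few otherwise-fixed parameters against prediction error, then train the readout as usual) can be sketched as below. This is a minimal, hypothetical illustration, not the paper's actual method: the reservoir size, the choice of spectral radius and input scaling as the PSO-optimized quantities, the toy sine-wave task, and all PSO constants are assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50  # reservoir size (assumed for illustration)

def make_esn(spectral_radius, input_scale, seed=42):
    # Fixed seed so the fitness of a parameter vector is deterministic.
    g = np.random.default_rng(seed)
    W_in = input_scale * g.uniform(-1.0, 1.0, (N,))
    W = g.uniform(-0.5, 0.5, (N, N))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def esn_rmse(params, u, y):
    """Fitness: run the reservoir, fit a ridge readout, return RMSE."""
    rho, scale = params
    W_in, W = make_esn(rho, scale)
    x, states = np.zeros(N), []
    for ut in u:
        x = np.tanh(W_in * ut + W @ x)
        states.append(x.copy())
    S, target = np.array(states[20:]), y[20:]  # drop washout steps
    W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ target)
    return float(np.sqrt(np.mean((S @ W_out - target) ** 2)))

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])

# Bare-bones PSO over (spectral_radius, input_scale).
n_particles, iters = 8, 15
lo, hi = np.array([0.1, 0.1]), np.array([1.2, 2.0])
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_err = np.array([esn_rmse(p, u, y) for p in pos])
gbest = pbest[pbest_err.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    err = np.array([esn_rmse(p, u, y) for p in pos])
    better = err < pbest_err
    pbest[better], pbest_err[better] = pos[better], err[better]
    gbest = pbest[pbest_err.argmin()].copy()

print(f"best (rho, scale) = {gbest}, RMSE = {pbest_err.min():.4f}")
```

After the loop, `gbest` would be injected back into the network as its initialization, and normal ESN training (here, the ridge readout) proceeds with those values.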
