Abstract

Echo state networks (ESNs) are one of the two major neural network models in the reservoir computing framework. Traditionally, only the weights connecting to the output neurons, termed read-out weights, are trained with a supervised learning algorithm, while the weights inside the reservoir of the ESN are randomly generated and remain fixed during training. In this paper, we investigate the influence of neural plasticity applied to the weights inside the reservoir on the learning performance of the ESN. We examine two plasticity rules, the anti-Oja learning rule and the Bienenstock–Cooper–Munro (BCM) learning rule, and their influence on prediction and classification performance when either offline or online supervised learning is employed to train the read-out connections. Empirical studies are conducted on two widely used classification tasks and two time series prediction problems. Our experimental results demonstrate that neural plasticity enhances learning performance more effectively when offline learning is applied. The results also indicate that the BCM rule outperforms the anti-Oja rule in improving the learning performance of the ESN in the offline learning mode.
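
To make the setup concrete, the sketch below shows a minimal ESN with a fixed random reservoir, an offline (ridge-regression) read-out, and an optional anti-Oja plasticity update applied to the internal weights. All sizes, scalings, and the toy one-step-ahead sine-prediction task are illustrative assumptions, not the paper's experimental configuration; the BCM rule is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100  # hypothetical sizes, not taken from the paper

# Fixed random reservoir, rescaled to spectral radius 0.9
# (a common heuristic for the echo state property).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W0 = rng.uniform(-0.5, 0.5, (n_res, n_res))
W0 *= 0.9 / max(abs(np.linalg.eigvals(W0)))

def run_reservoir(W, u_seq, eta=0.0, rule=None):
    """Drive the reservoir over input sequence u_seq.

    rule="anti_oja" applies the negated Oja update to the internal
    weights at every step (a plasticity sketch, not the paper's exact setup).
    Returns the collected states and the (possibly adapted) weight matrix.
    """
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ u + W @ x)
        if rule == "anti_oja":
            # Anti-Oja: sign-flipped Oja rule, decorrelating reservoir units.
            W = W - eta * (np.outer(x, x) - (x ** 2)[:, None] * W)
        states.append(x.copy())
    return np.array(states), W

# Toy task: one-step-ahead prediction of a sine wave.
u_seq = np.sin(np.linspace(0, 8 * np.pi, 400))[:, None]
y_seq = np.roll(u_seq, -1, axis=0)

# Plasticity pre-training of the reservoir itself (read-out not yet involved).
_, W_adapted = run_reservoir(W0, u_seq, eta=1e-4, rule="anti_oja")

# Offline read-out training: ridge regression on the collected states,
# discarding an initial washout period.
washout = 50
X, _ = run_reservoir(W_adapted, u_seq)
X, Y = X[washout:], y_seq[washout:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)
mse = float(np.mean((X @ W_out - Y) ** 2))
```

Note the division of labor the abstract describes: the reservoir weights are either left random or adapted by an unsupervised plasticity rule, while only `W_out` is fitted against the supervised target.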
