Abstract

The Echo State Network (ESN) is one of the most widely used machine learning methods for predicting chaotic time series. The topology of an ESN comprises three layers: input, hidden, and output. In this work, different ESN topologies are analyzed by modifying the structure and number of internal connections in the hidden layer, whose neurons are connected randomly. ESN topologies with 2%, 6%, 20%, 40%, …, up to 100% of internal connections are analyzed. The prediction error is measured by the Mean Squared Error (MSE), which has similar values for all ESN topologies as long as the echo state property is guaranteed. Notably, the prediction horizon obtained with 2% of internal connections is longer than that obtained with 100%, providing predictions of 80 and 60 steps ahead, respectively. These prediction results significantly impact the implementation on a field-programmable gate array (FPGA), so an enhanced hardware realization is introduced herein that reduces the use of multipliers by more than 90% compared with a fully connected topology, without significantly affecting the prediction horizon or the MSE. Another contribution is the FPGA implementation of the activation function, a hyperbolic tangent, for which different alternatives are shown to minimize the error of the approximation without significantly increasing the implementation cost or latency.
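As a rough illustration of the sparsity idea (a minimal sketch, not the authors' implementation), the following Python snippet builds a reservoir with a configurable fraction of internal connections, drives it with the tanh activation mentioned above, and trains a ridge-regression readout; the reservoir size, seeds, ridge parameter, and toy sine input are all assumptions for demonstration.

```python
import numpy as np

def make_reservoir(n, density, spectral_radius=0.9, seed=0):
    """Random reservoir with a given fraction of nonzero internal connections.
    Rescaling the spectral radius below 1 is a common practical heuristic to
    encourage the echo state property (sufficient in practice, not a proof)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(n, n))
    W *= rng.random((n, n)) < density          # keep only `density` of the links
    radius = np.max(np.abs(np.linalg.eigvals(W)))
    if radius > 0:
        W *= spectral_radius / radius
    return W

def run_reservoir(W, W_in, u):
    """Collect hidden states over the scalar input sequence u;
    tanh is the activation, as in the paper."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x)
    return np.array(states)

# Toy comparison of a 2% and a 100% connected reservoir (the sine input is a
# placeholder, not the chaotic series studied in the paper).
n, T = 100, 500
u = np.sin(0.2 * np.arange(T))
W_in = np.random.default_rng(1).uniform(-0.5, 0.5, n)
for density in (0.02, 1.0):
    X = run_reservoir(make_reservoir(n, density), W_in, u)
    A, y = X[:-1], u[1:]                       # one-step-ahead targets
    W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n), A.T @ y)
    print(f"density={density:.0%}  train MSE={np.mean((A @ W_out - y) ** 2):.2e}")
```

In hardware terms, the reservoir update `W @ x` is the dominant cost: with 2% density only about 2% of the products are nonzero, which is the source of the multiplier savings reported for the FPGA realization.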
