Abstract

Artificial neural networks (ANNs) constitute a rapidly growing field, mainly because of their wide range of everyday applications such as pattern recognition and time series forecasting. In particular, reservoir computing (RC) arises as a computational framework well suited for temporal/sequential data analysis. The direct on-silicon implementation of RC may help to minimize power consumption and maximize processing speed, which is especially relevant in edge intelligence applications where energy storage is considerably restricted. Nevertheless, most RC hardware solutions in the literature perform the training process off-chip at the server level, thus increasing processing time and overall power dissipation. Some studies integrate both learning and inference on the same chip, although these works typically implement unsupervised learning (UL) [with a lower expected accuracy than supervised learning (SL)] or propose iterative solutions (with a consequently higher power consumption). Therefore, the integration of RC systems combining inference with a fast, noniterative SL method is still an incipient field. In this article, we propose a noniterative SL methodology for RC systems that can be implemented in hardware either sequentially or fully in parallel. The proposal offers a considerable advantage in energy efficiency (EE) and processing speed compared with traditional off-chip methods. To validate the model, a cyclic echo state NN with on-chip learning capabilities for time series prediction has been implemented and tested on a field-programmable gate array (FPGA). In addition, a low-cost audio processing method is proposed that may be used to optimize the sound preprocessing steps.
