Abstract
Deep networks have gained considerable momentum in the neural network research community. In a deep network, each layer represents an increasingly abstract feature of the input data, and information propagates either in a feedforward fashion or cyclically through recurrent connections between layers. Training such deep networks requires complex, computationally expensive algorithms. Deep Reservoir Computing (RC) is an alternative that avoids this training complexity: the deep network (or reservoir) remains untrained, and training takes place only at the output node with a simple algorithm. So far, deep RC has been a software-based approach in which traditional Echo State Networks (ESNs) serve as the computing layers of the deep RC structure. Here, we propose a hardware-based platform for deep RC using memcapacitive networks. Our simulation results demonstrate that deep memcapacitive RC is competitive with state-of-the-art deep ESNs while requiring 3.45× fewer layers to accomplish similar tasks. Deep memcapacitive RC networks thus offer a potential platform for building novel neuromorphic hardware.
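To make the training scheme described above concrete, the following is a minimal sketch of the general deep-RC idea, not the paper's memcapacitive model: stacked random ESN layers are left untrained, and only a ridge-regression readout over the collected layer states is fitted. All layer sizes, spectral radius, leak rate, and the toy prediction task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_layer(n_in, n_res, spectral_radius=0.9, scale=0.5):
    """Random, untrained input and recurrent weights for one reservoir layer."""
    W_in = rng.uniform(-scale, scale, (n_res, n_in))
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state scaling
    return W_in, W

def run_layer(W_in, W, inputs, leak=0.3):
    """Drive one leaky-integrator reservoir layer; its weights are never trained."""
    states = np.zeros((len(inputs), W.shape[0]))
    x = np.zeros(W.shape[0])
    for t, u in enumerate(inputs):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

# Toy task (assumed for illustration): one-step-ahead prediction of a sine wave.
T = 1000
u = np.sin(np.linspace(0, 20 * np.pi, T))[:, None]
y = np.roll(u, -1, axis=0)

# Stack of untrained reservoir layers; each layer feeds the next.
layer_states, signal = [], u
for n_in in (1, 100):            # layer input sizes: raw input, then 100 reservoir units
    W_in, W = make_layer(n_in, 100)
    signal = run_layer(W_in, W, signal)
    layer_states.append(signal)

# Only the readout is trained, here with ridge regression over all layer states.
X = np.hstack(layer_states)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
print("Train MSE:", np.mean((X @ W_out - y) ** 2))
```

In the hardware platform proposed by the paper, the random software reservoirs would be replaced by memcapacitive networks, while the same lightweight readout training applies.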