Abstract

Convolutional Neural Networks (CNNs) have proven effective for machine learning tasks such as computer vision. Analog, asynchronous hardware implementations of such networks are promising avenues for fast, online, real-time, energy-efficient machine learning. However, the weight-sharing requirements of CNNs present challenges for such neuromorphic designs. We propose a biologically plausible method of implementing CNN weight sharing by convolving over time via recurrent synaptic connections. As a case study, we design a Recurrent Convolutional Neural Network (RCNN) for classification on the MNIST dataset. Our RCNN performs comparably to traditional CNN architectures while also providing significant area, storage, and connectivity advantages that qualify it for neuromorphic implementations capable of computing on analog and/or time-based signals.
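The core idea, implementing spatial weight sharing by reusing one set of synaptic weights across time steps, can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; all names (`w`, `x`, `T`) are illustrative assumptions. It shows that applying the same small kernel at successive "time steps" (shifts of the input) reproduces a spatial convolution without storing per-position copies of the weights.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(3)        # one shared kernel (3 synapses)
x = rng.standard_normal(8)        # input signal

# Conventional spatial convolution (valid mode); reversing w turns
# NumPy's convolution into a sliding dot product with w itself.
spatial = np.convolve(x, w[::-1], mode="valid")

# "Convolution over time": at each step t the SAME weights w act on the
# current window of the input, so only one copy of w is ever stored.
T = len(x) - len(w) + 1
temporal = np.array([w @ x[t:t + len(w)] for t in range(T)])

assert np.allclose(spatial, temporal)
```

In a neuromorphic setting, the loop over `t` would correspond to the input streaming past a fixed set of recurrent synapses rather than to replicated weight copies in silicon.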
