Abstract

Deep convolutional neural networks (DCNNs) have achieved state-of-the-art performance in classification, natural language processing (NLP), and regression tasks. However, there is still a large gap between DCNNs and the human brain in terms of computational efficiency. Inspired by neural synaptic plasticity and stochastic computing (SC), we propose neural synaptic plasticity-inspired computing (NSPC) to simulate the human brain’s neural network activity for inference tasks using simple logic gates. In NSPC, multiplication and accumulation (MAC) is realized through wire connectivity, requiring only bundles of wires and small-width adders. In this way, NSPC imitates the structure of neural synaptic plasticity from a circuit-wiring perspective. Furthermore, following the NSPC principle, a data mapping method converts convolution operations into matrix multiplications. Based on the NSPC methodology, a fully pipelined, low-latency architecture is designed. The proposed NSPC accelerator exhibits high hardware efficiency while maintaining a comparable level of network accuracy. The NSPC-based DCNN accelerator (NSPC-CNN) processes DCNNs at $1.5625$ M images/s with a power dissipation of $15.42~W$ and an area of $36.4~mm^{2}$. The NSPC-based deep neural network (DNN) accelerator (NSPC-DNN), which implements a three-layer fully connected DNN, occupies only $6.6~mm^{2}$ of area, consumes $2.93~W$ of power, and achieves a throughput of $400$ M images/s. Compared with conventional fixed-point implementations, NSPC-CNN achieves $2.77\times$ area efficiency and $2.25\times$ power efficiency, while NSPC-DNN exhibits $2.31\times$ area efficiency and $2.09\times$ power efficiency.
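The abstract refers to a data mapping method that converts convolution operations into matrix multiplications. The paper's exact mapping and hardware layout are not given here, so the sketch below only illustrates the general idea with an im2col-style unfolding in NumPy; the function name `im2col`, the single-channel input, and the unit stride are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def im2col(x, kh, kw, stride=1):
    """Unfold kh x kw sliding windows of a 2-D input into rows of a matrix."""
    H, W = x.shape
    out_h = (H - kh) // stride + 1
    out_w = (W - kw) // stride + 1
    cols = np.empty((out_h * out_w, kh * kw), dtype=x.dtype)
    row = 0
    for i in range(0, stride * out_h, stride):
        for j in range(0, stride * out_w, stride):
            cols[row] = x[i:i + kh, j:j + kw].ravel()  # one receptive-field patch per row
            row += 1
    return cols, out_h, out_w

# A 3x3 kernel on a 6x6 feature map: the convolution becomes one matrix-vector product.
x = np.arange(36, dtype=np.float32).reshape(6, 6)
k = np.ones((3, 3), dtype=np.float32) / 9.0          # simple averaging kernel
cols, oh, ow = im2col(x, 3, 3)
y = (cols @ k.ravel()).reshape(oh, ow)               # same result as sliding the kernel directly
```

In an NSPC-style datapath, the resulting matrix product would presumably be evaluated with the wire-connectivity MAC scheme described above (bundles of wires feeding small-width adders) rather than conventional multipliers.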
