Abstract

Deep Neural Networks (DNNs) have been found to outperform conventional programming approaches in several applications such as computer vision and natural language processing. Efficient hardware architectures for deploying DNNs on edge devices have been actively studied. Emerging memory technologies, with their better scalability, non-volatility, and good read performance, are ideal candidates for storing DNNs, which are trained once and deployed across many devices. Emerging memories have also been used in DNN accelerators for efficient dot-product computation. However, due to immature manufacturing processes and limited cell endurance, emerging resistive memories often suffer from reliability issues such as stuck-at faults, which reduce chip yield and pose a challenge to the accuracy of DNNs. Depending on its stuck state and the bit stored in the faulty cell, a stuck-at fault may or may not cause an error. The fault tolerance of DNNs can therefore be enhanced by reducing the impact of errors resulting from stuck-at faults. In this work, we introduce simple and lightweight intra-block address remapping and weight encoding techniques to improve the fault tolerance of DNNs. The proposed schemes work at network deployment time while preserving the network organization and the original parameter values. Experimental results on state-of-the-art DNN models indicate that, with a small storage overhead of just 0.98%, the proposed techniques achieve up to a 300× improvement in stuck-at fault tolerance on the CIFAR-10 dataset and 125× on the ImageNet dataset, compared to baseline DNNs without any fault-tolerance method. Integrated with existing schemes, the proposed techniques can further enhance the fault resilience of DNNs.
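To make the underlying intuition concrete, the sketch below illustrates (in Python) why a stuck-at cell is harmless whenever the stored bit already equals the stuck value, and how remapping and encoding can exploit this. It is a minimal toy example, not the paper's implementation: the 4-bit weight width, block size, fault-map format, and all function names (mismatches, residual_errors, remap_block) are assumptions made purely for illustration.

from itertools import permutations

WORD_BITS = 4  # assumed toy weight width; real DNN weights are typically 8 or 16 bits

def mismatches(word, faults):
    # Count bits of `word` that disagree with stuck-at cells.
    # `faults` maps bit position -> stuck value (0 or 1).
    return sum(((word >> pos) & 1) != val for pos, val in faults.items())

def residual_errors(word, faults):
    # Weight-encoding idea: store the word as-is or bit-inverted (recorded
    # by a one-bit flag), whichever collides with fewer stuck-at cells.
    inverted = word ^ ((1 << WORD_BITS) - 1)
    return min(mismatches(word, faults), mismatches(inverted, faults))

def remap_block(words, fault_map):
    # Intra-block address-remapping idea: try assignments of logical words
    # to physical addresses inside one block and keep the assignment with
    # the fewest residual bit errors after encoding.
    return min(
        permutations(range(len(words))),
        key=lambda perm: sum(
            residual_errors(words[i], fault_map.get(addr, {}))
            for addr, i in enumerate(perm)
        ),
    )

# Toy usage: four 4-bit weights, two faulty physical words in the block.
words = [0b1010, 0b0110, 0b1111, 0b0001]
fault_map = {0: {3: 1}, 2: {0: 0, 1: 0}}   # physical addr -> {bit position: stuck value}
print(remap_block(words, fault_map))        # permutation minimizing residual errors

The exhaustive permutation search here is only feasible for tiny blocks; a deployable scheme would use a much cheaper mapping and a compact flag per block, consistent with the small (0.98%) storage overhead reported in the abstract.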
