Abstract

Mixed-signal in-memory computation can drastically improve the efficiency of the hardware implementing machine learning (ML) algorithms by (i) removing the need to fetch neural network parameters from internal or external memory and (ii) performing a large number of multiply-accumulate operations in parallel. However, this boost in efficiency comes with some disadvantages. Among them, the inability to precisely program nonvolatile memory (NVM) devices with neural network parameters and the sensitivity to noise prevent the mixed-signal hardware from performing precise and deterministic computation. Unfortunately, these hardware-specific errors can be magnified as they propagate through the layers of a deep neural network. In this paper, we show that the inability to implement the parameters of an already trained network with sufficient precision can completely prevent the network from performing any meaningful operation. However, even at this level of degradation, the feature extractor section of the network still extracts enough information that an acceptable level of performance can be recovered by retraining only the last classification layers of the network. Our results suggest that instead of blindly trying to implement software algorithms in hardware as precisely as possible, it might be more efficient to implement neural networks with imperfect devices and circuits and let the network itself compensate for these imprecise computations by retraining only a few layers.
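The following is a minimal sketch of the strategy the abstract describes, not the authors' actual experimental setup. It assumes PyTorch and torchvision, uses ResNet-18 as a stand-in network, and models imprecise NVM programming as multiplicative Gaussian weight noise; the function name `program_with_device_noise` and the noise level are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical illustration (not from the paper): simulate imprecise NVM
# weight programming by perturbing a pretrained network's parameters, then
# recover accuracy by retraining only the final classification layer.

def program_with_device_noise(model, rel_sigma=0.1):
    """Perturb every parameter with multiplicative Gaussian noise,
    modeling the limited programming precision of NVM cells."""
    with torch.no_grad():
        for p in model.parameters():
            p.mul_(1.0 + rel_sigma * torch.randn_like(p))

model = models.resnet18(weights="IMAGENET1K_V1")
program_with_device_noise(model, rel_sigma=0.1)  # noise level is illustrative

# Freeze the (now imprecise) feature extractor; only the classifier adapts.
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 1000)  # fresh, trainable head

optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()
# ... a standard training loop over the target dataset then updates only
# model.fc, letting the classifier compensate for the noisy features.
```

The key design point mirrored here is that the perturbed backbone is never corrected; only the small final layer is retrained, which is far cheaper than reprogramming or retraining the whole network.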
