Abstract
Deploying deep convolutional neural networks on mobile devices is challenging because of the conflict between their heavy computational overhead and the hardware's restricted computing capacity. Network quantization is typically used to alleviate this problem. However, we found that a "datatype mismatch" issue in existing low-bitwidth quantization approaches can generate severe instruction redundancy, dramatically reducing their running efficiency on mobile devices. We therefore propose a novel quantization approach that ensures only integer arithmetic is needed during the inference stage of the quantized model. To this end, we improve the quantization function so that the quantized values follow a standard integer format, and we simultaneously quantize the batch normalization parameters with a logarithm-like method. In this way, the quantized model keeps the advantage of low-bitwidth representation while avoiding the "datatype mismatch" issue and the corresponding instruction redundancy. Comprehensive experiments show that our method achieves prediction accuracy comparable to other state-of-the-art methods while reducing run-time latency by a large margin. Our fully integer-based quantized ResNet-18 uses 4-bit weights and 4-bit activations with only a 0.7% top-1 and 0.4% top-5 accuracy drop on the ImageNet dataset. An assembly-language implementation of a series of building blocks reaches up to 4.33× the speed of the original full-precision version on an ARMv8 CPU.
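To make the two ideas in the abstract concrete, the following is a minimal Python sketch (not the authors' released code; all function and variable names are illustrative assumptions). It shows a quantizer whose outputs are genuine low-bitwidth integers, and a logarithm-like (power-of-two) quantization of the folded batch normalization scale so that applying it at inference time is an integer bit shift rather than a floating-point multiply.

    # Illustrative sketch only: integer-format quantization plus power-of-two
    # quantization of the batch normalization scale. Names are hypothetical.
    import numpy as np

    def quantize_to_int(x, num_bits=4):
        """Uniformly quantize a float tensor to signed integers in [-2^(b-1), 2^(b-1)-1]."""
        qmax = 2 ** (num_bits - 1) - 1
        max_abs = np.max(np.abs(x))
        scale = max_abs / qmax if max_abs > 0 else 1.0
        q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int32)
        return q, scale  # q is a true integer tensor; scale is kept for dequantization

    def quantize_bn_scale_pow2(folded_scale):
        """Round the folded BN scale to the nearest power of two (logarithm-like),
        so applying it becomes a bit shift instead of a float multiply."""
        exponent = np.round(np.log2(np.abs(folded_scale))).astype(np.int32)
        return exponent

    # Toy usage: an integer-only "linear layer + BN scaling" step on random data.
    rng = np.random.default_rng(0)
    w, w_scale = quantize_to_int(rng.standard_normal((8, 8)), num_bits=4)
    a, a_scale = quantize_to_int(rng.standard_normal(8), num_bits=4)

    acc = w @ a  # int32 accumulation, no floating-point arithmetic
    shift = quantize_bn_scale_pow2(rng.uniform(0.5, 2.0, size=8))
    out = np.array([int(v) << int(s) if s >= 0 else int(v) >> int(-s)
                    for v, s in zip(acc, shift)])  # shifts in place of multiplies

Because both the accumulator and the BN scaling stay in integer form, no conversion between integer and floating-point registers is needed, which is the source of the instruction redundancy the abstract refers to.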