Abstract

This paper contributes to the goal of finding an efficient compression solution for post-training quantization from the perspective of support region choice, under the framework of low-bit uniform quantization. We give preference to uniform quantization because the choice of the support region is the most sensitive design decision in the uniform quantization of nonuniform sources (e.g., Laplacian sources). Accordingly, we analyse in detail how the choice of the support region influences the performance of two-bit uniform quantization, measured by the signal-to-quantization-noise ratio (SQNR), and the accuracy of the compressed neural network (NN) model. We provide experimental and theoretical results for several significant cases of two-bit uniform quantizer design, under the assumption that the Laplacian distribution models the weights of our fully connected NN; we opt for the Laplacian distribution since it models NN weights well. Specifically, we analyse whether the simplest uniform quantization can be applied to the representation of trained NN model weights at a bit rate of R = 2 bits/sample while preserving the accuracy of the model to a great extent. We also aim to determine whether the choice of the key parameter of the two-bit uniform quantizer (the support region threshold) affects SQNR and accuracy equally. Moreover, we extend our analysis to layer-wise two-bit uniform quantization in order to examine whether an additional improvement in the accuracy of our NN model can be achieved on the MNIST dataset. We believe that the detailed analysis of post-training quantization described and conducted in this paper will be useful for further research on this very current topic, especially because the post-training quantization problem is addressed from the particularly important perspective of support region choice.
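The dependence of SQNR on the support region threshold described above can be illustrated with a minimal simulation, shown below. This is a sketch under stated assumptions, not the paper's implementation: the quantizer is a symmetric four-level (R = 2) midrise uniform quantizer with midpoint reconstruction, the source is a unit-variance Laplacian, and the names `t_max`, `two_bit_uniform_quantize`, and `sqnr_db` are illustrative.

```python
import numpy as np

# Unit-variance Laplacian source (variance of Laplace(0, b) is 2*b^2,
# so b = 1/sqrt(2)); stands in for the distribution of NN weights.
rng = np.random.default_rng(0)
x = rng.laplace(loc=0.0, scale=1.0 / np.sqrt(2), size=100_000)

def two_bit_uniform_quantize(x, t_max):
    """Symmetric 4-level uniform quantizer with support region [-t_max, t_max]."""
    step = 2 * t_max / 4                        # step size for 2^R = 4 cells
    idx = np.clip(np.floor(x / step), -2, 1)    # cell index; clipping handles overload
    return (idx + 0.5) * step                   # midpoint reconstruction

def sqnr_db(x, y):
    """Signal-to-quantization-noise ratio in decibels."""
    return 10 * np.log10(np.mean(x ** 2) / np.mean((x - y) ** 2))

# Sweep the support region threshold: SQNR degrades if t_max is too small
# (overload distortion dominates) or too large (granular distortion dominates).
for t_max in (1.0, 1.5, 2.0, 2.5, 3.0):
    y = two_bit_uniform_quantize(x, t_max)
    print(f"t_max = {t_max:.1f}  SQNR = {sqnr_db(x, y):.2f} dB")
```

A sweep like this exposes the trade-off the abstract refers to: the threshold balances granular noise inside the support region against overload noise outside it, and the SQNR-optimal choice need not coincide with the accuracy-optimal one for a quantized NN.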
