Abstract
Data-free quantization aims to quantize a model without access to any authentic samples, which is important in privacy-sensitive applications. A popular approach, known as generative data-free quantization, converts noise vectors into synthetic samples through a generator. However, synthetic samples differ from authentic samples in their attention maps. This difference is usually ignored and restricts quantization performance. First, since synthetic samples of the same class tend to have homogeneous attention, the quantized network can learn only limited intra-class visual features. Second, synthetic samples exhibit different attention in eval mode and training mode, so the statistical distribution matching tends to be inaccurate. This paper proposes ACQ to correct the attention of synthetic samples. To address intra-class attention homogeneity, we introduce an attention center matching loss that achieves coarse-grained matching of the attention of synthetic samples. In addition, we devise an adversarial loss over pairs of samples with identical conditions. On one hand, this mechanism prevents the generator from mode collapse caused by excessive reliance on conditional information; on the other hand, it increases the separation between intra-class samples, further enhancing intra-class attention diversity. To improve the attention similarity of synthetic samples across network modes, we introduce a consistency penalty that guarantees accurate statistical distribution matching. Experimental results demonstrate that ACQ effectively mitigates the attention problems of synthetic samples and achieves the best quantization performance under various training settings. For 4-bit quantization of ResNet-18 and ResNet-50, ACQ reaches 67.55% and 72.23% accuracy, respectively.
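To make the idea of attention center matching concrete, the sketch below shows one plausible formulation, assuming attention maps are computed as the channel-wise mean of squared activations (a common definition; the paper's exact loss may differ). Each synthetic sample's attention map is pulled toward the mean attention map of its class, which corresponds to the coarse-grained matching described above. All function names here are illustrative, not from the paper.

```python
import numpy as np

def attention_map(features):
    # features: (N, C, H, W) activations.
    # Attention = channel-wise mean of squared activations,
    # L2-normalized per sample (one common formulation).
    a = (features ** 2).mean(axis=1)            # (N, H, W)
    flat = a.reshape(a.shape[0], -1)            # (N, H*W)
    norm = np.linalg.norm(flat, axis=1, keepdims=True) + 1e-8
    return flat / norm

def attention_center_loss(features, labels, num_classes):
    # Hypothetical sketch of an attention-center matching loss:
    # for each class, compute the mean attention map (the "center")
    # and penalize each sample's squared distance to it.
    maps = attention_map(features)
    loss = 0.0
    for c in range(num_classes):
        idx = labels == c
        if not idx.any():
            continue
        center = maps[idx].mean(axis=0)
        loss += ((maps[idx] - center) ** 2).sum(axis=1).mean()
    return loss / num_classes
```

Minimizing this term alone would drive all intra-class attention maps toward the center; in ACQ it is balanced by the adversarial pair loss, which pushes intra-class samples apart and preserves attention diversity.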