Abstract

By truncating the weights and activations of a deep neural network to a single bit, conventional binary quantization limits the representational capability of the network parameters and thus degrades detection performance. In this paper, a joint-guided distillation binary neural network via dynamic channel-wise diversity enhancement for object detection (JDBNet) is proposed to narrow this quantization-error gap. JDBNet combines a dynamic channel-wise diversity scheme with real-valued joint-guided teacher assistance to strengthen the representational capability of the binary neural network on object detection tasks. In the dynamic diversity scheme, a learning channel-wise bias (LCB) layer adjusts the magnitudes of the parameters, reducing their sensitivity to the quantization method and thereby enriching the diversity of the feature representations. In the joint-guided strategy, single-precision implicit knowledge from the guiding teacher is applied at multiple levels of the network to supervise and penalize the quantized model, improving how well the parameters of the binary model fit the task. Extensive experiments on the PASCAL VOC, MS COCO, and VisDrone-DET datasets demonstrate that JDBNet outperforms state-of-the-art binary object detection networks in terms of mean Average Precision.
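The sketch below illustrates, under our own assumptions, the two mechanisms the abstract summarizes: a learnable channel-wise bias that shifts activations before binarization (LCB-style), and a multilevel feature-distillation penalty from a real-valued teacher. The names `BinarizeSTE`, `LCBBinaryConv2d`, and `multilevel_distill_loss` are illustrative, not the authors' code; binarization here uses `sign()` with a straight-through estimator and an XNOR-Net-style channel scaling, which are common choices rather than the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """sign() in the forward pass, straight-through gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Pass gradients only where |x| <= 1 (hard-tanh clipping).
        return grad_out * (x.abs() <= 1).float()

class LCBBinaryConv2d(nn.Module):
    """Binary convolution preceded by a learnable channel-wise bias (LCB-style).

    The per-channel bias shifts activation magnitudes before sign(), so each
    channel can move its own binarization threshold, which is one way to keep
    channel-wise diversity after quantization.
    """
    def __init__(self, in_ch, out_ch, k=3, stride=1, padding=1):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.01)
        self.lcb = nn.Parameter(torch.zeros(1, in_ch, 1, 1))  # channel-wise bias
        self.stride, self.padding = stride, padding

    def forward(self, x):
        x_b = BinarizeSTE.apply(x + self.lcb)   # shift, then binarize activations
        w_b = BinarizeSTE.apply(self.weight)    # binarize weights
        # Per-output-channel scaling factor (mean |w|), as in XNOR-Net-style BNNs.
        alpha = self.weight.abs().mean(dim=(1, 2, 3)).view(1, -1, 1, 1)
        return F.conv2d(x_b, w_b, stride=self.stride, padding=self.padding) * alpha

def multilevel_distill_loss(student_feats, teacher_feats):
    """L2 penalty between student features and detached real-valued teacher
    features at several depths: a common form of multilevel distillation."""
    return sum(F.mse_loss(s, t.detach()) for s, t in zip(student_feats, teacher_feats))
```

In a training loop, `multilevel_distill_loss` would simply be added to the detection loss, so the binary student is penalized wherever its intermediate features drift from the full-precision teacher's.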
