Abstract
Anomaly detection poses a significant challenge in industry, and knowledge distillation, built from a frozen teacher network and a trainable student network, is the prevailing approach for detecting suspicious regions. Forward and reverse distillation are the two main ways to realize it. To design an effective model and to aggregate detection results, we propose dual-student knowledge distillation (DSKD), which builds on both forward and reverse distillation. Exploiting the strength of reverse distillation in capturing high-level representations, we combine a skip connection and an attention module to build a reverse-distillation student network that attends to high-level representations and low-level features simultaneously. DSKD additionally uses a forward-distillation network as an auxiliary branch so that the student network has direct access to the query image. For the different anomaly score maps produced by the dual-student network, we use synthetic noise augmentation together with an image segmentation loss to adaptively learn a weight for each map. Experiments on the MVTec dataset show that the proposed DSKD method achieves good performance on texture images and competitive results on object images compared with other state-of-the-art methods. Ablation experiments and a visualization analysis further validate the contribution of each of the model's components.
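To make the score-aggregation step concrete, the following is a minimal PyTorch-style sketch, not the authors' released code: it assumes the common teacher-student practice of scoring anomalies as 1 minus the cosine similarity between teacher and student features at each scale, and it parametrizes the adaptively learned per-map weights as a softmax over trainable logits, trained with a segmentation loss on synthetically corrupted images whose anomaly masks are known. All names (`anomaly_map`, `ScoreFusion`) and the toy feature shapes are illustrative assumptions.

```python
# Hypothetical sketch of dual-student score-map fusion; not the DSKD reference code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def anomaly_map(teacher_feats, student_feats, out_size):
    """Per-scale 1 - cosine similarity, upsampled to image size and summed."""
    amap = 0.0
    for t, s in zip(teacher_feats, student_feats):
        sim = F.cosine_similarity(t, s, dim=1, eps=1e-6)        # (B, H, W)
        a = (1.0 - sim).unsqueeze(1)                            # (B, 1, H, W)
        amap = amap + F.interpolate(a, size=out_size, mode="bilinear",
                                    align_corners=False)
    return amap

class ScoreFusion(nn.Module):
    """One possible parametrization of the adaptive weights: a softmax over
    trainable logits, so the fused map is a convex combination of the
    forward- and reverse-distillation score maps."""
    def __init__(self, n_maps=2):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(n_maps))

    def forward(self, maps):                 # maps: list of (B, 1, H, W)
        w = torch.softmax(self.logits, dim=0)
        return sum(wi * m for wi, m in zip(w, maps))

# Toy usage with random stand-ins for teacher and student features.
t_feats = [torch.rand(2, 64, 32, 32), torch.rand(2, 128, 16, 16)]
fwd_feats = [f + 0.1 * torch.randn_like(f) for f in t_feats]    # forward student
rev_feats = [f + 0.2 * torch.randn_like(f) for f in t_feats]    # reverse student
fwd_map = anomaly_map(t_feats, fwd_feats, out_size=(256, 256))
rev_map = anomaly_map(t_feats, rev_feats, out_size=(256, 256))

fusion = ScoreFusion(n_maps=2)
mask = (torch.rand(2, 1, 256, 256) > 0.95).float()  # stand-in synthetic-noise mask
fused = fusion([fwd_map, rev_map])
loss = F.binary_cross_entropy_with_logits(fused, mask)
loss.backward()                              # trains only the fusion weights here
```

Learning the weights against masks from synthetic corruptions, rather than fixing them by hand, lets the fusion reflect which student is more reliable on a given image type; the softmax form above is just one simple way to keep the learned weights normalized.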