Abstract
Few-shot anomaly detection (FSAD) denotes the identification of anomalies within a target category using only a limited number of normal samples. Existing FSAD methods largely rely on pretrained feature representations to detect anomalies, but the inherent domain gap between pretrained representations and target FSAD scenarios is often overlooked. This study proposes a prototypical learning-guided context-aware segmentation network (PCSNet) to address the domain gap, thereby improving feature descriptiveness in target scenarios and enhancing FSAD performance. In particular, PCSNet comprises a prototypical feature adaption (PFA) subnetwork and a context-aware segmentation (CAS) subnetwork. PFA extracts prototypical features as guidance to ensure better feature compactness for normal data while maintaining distinct separation from anomalies. A pixel-level disparity classification (PDC) loss is also designed to make subtle anomalies more distinguishable. A CAS subnetwork is then introduced for pixel-level anomaly localization, where pseudo anomalies are exploited to facilitate the training process. Experimental results on MVTec AD and the metal part defect detection (MPDD) dataset demonstrate the superior FSAD performance of PCSNet, with image-level area under the receiver operating characteristic curve (AUROC) scores of 94.9% and 80.2%, respectively, in an eight-shot scenario. Real-world applications to automotive plastic part inspection further demonstrate that PCSNet achieves promising results with limited training samples. The code is available at https://github.com/yuxin-jiang/PCSNet.
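As a rough illustration of the prototype-guided idea underlying PFA (not the paper's actual subnetwork, which adapts pretrained features with learned losses), the sketch below scores a test feature by its distance to the nearest prototype built from the few available normal samples; all names and values here are hypothetical.

```python
import math

def prototype_scores(prototypes, test_feats):
    """Assign each test feature an anomaly score: the Euclidean
    distance to its nearest normal prototype. Larger score means
    the feature lies farther from the normal-data cluster."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return [min(dist(t, p) for p in prototypes) for t in test_feats]

# Few-shot "normal" prototypes and two test features: one close to a
# prototype (likely normal), one far from all prototypes (anomalous).
protos = [[0.0, 0.0], [1.0, 1.0]]
tests = [[0.1, 0.0], [5.0, 5.0]]
scores = prototype_scores(protos, tests)
```

In this toy setup the first test feature receives a much lower score than the second, mirroring how compact normal features and well-separated anomalies make thresholding easier.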
Published in: IEEE Transactions on Neural Networks and Learning Systems