TDAD: Trident Distillations for Anomaly Detection

Abstract

The problem of overgeneralization is widespread in unsupervised anomaly detection methods, especially those that rely on knowledge distillation techniques. This problem arises because the student network has a strong tendency to mimic its teacher, even for unseen anomaly patterns, resulting in erroneous predictions. To tackle this issue, we have developed Trident Distillation Anomaly Detection (TDAD), which uses a trident distillation process in a self-supervised masked training paradigm. TDAD incorporates synthetic anomalies and seamlessly blends general knowledge distillation (GKD) with novel self-consistency distillation (SCD) and discrepancy maximization distillation (DMD) techniques. The synergistic optimization of these components widens the gap between abnormal feature distributions in the teacher and student domains, while maintaining coherence within the normal distributions, thereby enhancing prediction reliability. Extensive experiments conducted on the MVTec dataset demonstrate that TDAD effectively mitigates overgeneralization, achieving superior anomaly detection performance compared to its competitors.
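The abstract describes three distillation terms that are optimized jointly: GKD aligns student and teacher features on normal regions, SCD enforces consistency within the student, and DMD pushes teacher and student apart on synthetically corrupted regions. The exact loss formulations are not given in the abstract; the sketch below is a minimal, hypothetical NumPy illustration of how such a combined objective could be structured (the function names, weights, and the cosine-distance choice are assumptions, not the paper's actual definitions).

```python
import numpy as np

def cosine_dist(a, b, eps=1e-8):
    # 1 - cosine similarity per spatial location, over the channel axis.
    # a, b: (C, H, W) feature maps -> returns an (H, W) distance map.
    num = (a * b).sum(axis=0)
    den = np.linalg.norm(a, axis=0) * np.linalg.norm(b, axis=0) + eps
    return 1.0 - num / den

def trident_loss(t_feat, s_feat, s_feat_aug, anomaly_mask,
                 w_gkd=1.0, w_scd=1.0, w_dmd=1.0):
    """Hypothetical combination of the three terms named in the abstract.

    t_feat:      (C, H, W) teacher features
    s_feat:      (C, H, W) student features on the (masked) input
    s_feat_aug:  (C, H, W) student features on an augmented/re-masked view
    anomaly_mask:(H, W) binary map marking synthetic-anomaly pixels
    """
    d_ts = cosine_dist(t_feat, s_feat)      # teacher-student distance map
    d_ss = cosine_dist(s_feat, s_feat_aug)  # student self-consistency map
    normal = 1.0 - anomaly_mask

    # GKD: align student with teacher on normal regions.
    gkd = (d_ts * normal).sum() / max(normal.sum(), 1.0)
    # SCD: keep the student consistent across masked views.
    scd = d_ss.mean()
    # DMD: maximize teacher-student discrepancy on synthetic anomalies
    # (negated so that minimizing the total loss widens the gap).
    dmd = -(d_ts * anomaly_mask).sum() / max(anomaly_mask.sum(), 1.0)

    return w_gkd * gkd + w_scd * scd + w_dmd * dmd
```

At inference, a method of this kind would typically score anomalies with the teacher-student distance map (`d_ts`) alone: regions where the student fails to mimic the teacher are flagged as anomalous, which is exactly the behavior the DMD term encourages during training.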
