Abstract

Due to the high-risk working environment of high-voltage transmission lines, defect samples of strain clamps cannot be collected completely. As a result, deep learning methods that rely on labelled defect samples cannot effectively identify all abnormalities. To solve this problem, an unsupervised anomaly detection method based on knowledge distillation is proposed, which requires only a small number of normal samples to drive the model for anomaly detection. ResNet is used as the framework of the teacher-student model, and the feature activation layers after the ResBlocks are used for knowledge transfer. Residual-assisted attention and pyramid-split attention are introduced to enhance the model's spatial perception and its use of multi-scale information. Because the model transfers only the information of normal samples, it is sensitive to abnormal samples. The proposed model outperformed the baseline by 2%-3% overall and by 7%-8% on individual categories on the MVTec AD (anomaly detection) dataset, outperformed the baseline by 4%-5% overall and by 10% on individual categories on CIFAR-10, and is also reliable on MNIST and Fashion-MNIST. On the self-built dataset, the method achieves the best performance (82.71%) among existing methods.
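
As a rough illustration of the distillation scheme summarised above, the sketch below trains a student ResNet to reproduce a frozen teacher's post-ResBlock feature maps on normal samples and scores anomalies by the teacher-student feature discrepancy at test time. This is a minimal sketch in PyTorch/torchvision, not the authors' code: the backbone choice (resnet18), the matched stages (layer1-layer3), the cosine-distance loss, and the function names (distillation_loss, anomaly_map) are assumptions for illustration, and the paper's residual-assisted and pyramid-split attention modules are not reproduced here.

```python
# Minimal teacher-student feature distillation sketch for anomaly detection.
# Assumes PyTorch and torchvision >= 0.13; names and hyperparameters are illustrative.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18
from torchvision.models.feature_extraction import create_feature_extractor

# Feature maps taken after the ResBlock stages, as described in the abstract.
LAYERS = ["layer1", "layer2", "layer3"]

teacher = create_feature_extractor(resnet18(weights="IMAGENET1K_V1"), LAYERS).eval()
student = create_feature_extractor(resnet18(weights=None), LAYERS)
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher is frozen; only the student is trained


def distillation_loss(x):
    """Train the student to match teacher features on normal samples only."""
    with torch.no_grad():
        t_feats = teacher(x)
    s_feats = student(x)
    loss = 0.0
    for name in LAYERS:
        t = F.normalize(t_feats[name], dim=1)  # channel-wise L2 normalisation
        s = F.normalize(s_feats[name], dim=1)
        loss = loss + (1.0 - (t * s).sum(dim=1)).mean()  # cosine distance per position
    return loss


def anomaly_map(x, size=(256, 256)):
    """At test time, a large teacher-student discrepancy flags anomalous regions."""
    with torch.no_grad():
        t_feats, s_feats = teacher(x), student(x)
        maps = []
        for name in LAYERS:
            t = F.normalize(t_feats[name], dim=1)
            s = F.normalize(s_feats[name], dim=1)
            d = 1.0 - (t * s).sum(dim=1, keepdim=True)  # B x 1 x H x W distance map
            maps.append(F.interpolate(d, size=size, mode="bilinear"))
        return torch.stack(maps).mean(dim=0)  # average over the matched stages
```

Because the student is trained only on normal samples, its features diverge from the teacher's on defective regions, which is what the anomaly map measures.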
