Abstract

Surface defects on yarn-dyed fabrics are a major factor affecting fabric quality, and defect detection is a core step in quality control. Because yarn-dyed fabric patterns are diverse and defect samples are scarce, reconstruction-based unsupervised deep learning algorithms have received extensive attention in the field of fabric defect detection. However, most existing deep learning algorithms cannot fully extract shallow, high-frequency, and high-level information, which limits their ability to reconstruct yarn-dyed fabric images. In this article, we propose an Attention-based Feature Fusion Generative Adversarial Network framework for unsupervised defect detection of yarn-dyed fabrics. The framework uses a modified Feature Pyramid Network to fuse multi-level information and an attention mechanism to enhance the model's feature representation capability. The Attention-based Feature Fusion Generative Adversarial Network consists of an attention fusion generator and a patch-level discriminator. In the attention fusion generator, a Feature Pyramid Network with EfficientNetV2 as the backbone serves as the core building block, and different feature fusion methods are used to avoid information loss as the network deepens. The attention mechanism strengthens channel-wise and spatial-wise correlations among features, helping the model focus on more meaningful information by recalibrating the feature maps. The patch-level discriminator measures the similarity between the reconstructed image and the original image from a local perspective, improving the model's attention to texture details. Experimental results on public datasets demonstrate the effectiveness of the proposed method compared with other methods.
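Two of the ideas above can be made concrete with a small sketch: attention-based recalibration scales a feature map by channel-wise and spatial-wise gates, and patch-level comparison scores the reconstruction against the original over local regions. The NumPy code below is a minimal illustration of these ideas only, not the paper's implementation; the learned gating networks and the adversarial discriminator are omitted, and all function names are our own.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(fmap):
    # fmap: (C, H, W). A global-average-pooled channel descriptor is
    # squashed to (0, 1) and rescales each channel. (A real module
    # would pass the descriptor through a learned bottleneck MLP.)
    desc = fmap.mean(axis=(1, 2))             # (C,)
    gate = sigmoid(desc)                      # (C,)
    return fmap * gate[:, None, None]

def spatial_attention(fmap):
    # Pool across channels to a (H, W) descriptor and gate each
    # spatial location, emphasising informative regions.
    desc = fmap.mean(axis=0)                  # (H, W)
    gate = sigmoid(desc)                      # (H, W)
    return fmap * gate[None, :, :]

def recalibrate(fmap):
    # Channel-wise then spatial-wise recalibration of a feature map.
    return spatial_attention(channel_attention(fmap))

def patchwise_error(original, recon, patch=4):
    # Compare reconstruction and original from a local perspective:
    # mean squared error over non-overlapping patches yields a coarse
    # map whose high values indicate poorly reconstructed (potentially
    # defective) regions.
    H, W = original.shape
    scores = np.zeros((H // patch, W // patch))
    for i in range(H // patch):
        for j in range(W // patch):
            a = original[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
            b = recon[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
            scores[i, j] = ((a - b) ** 2).mean()
    return scores

feat = np.random.rand(8, 16, 16)
out = recalibrate(feat)            # same shape, attention-weighted
emap = patchwise_error(np.ones((16, 16)), np.zeros((16, 16)), patch=4)
```

Because both gates lie in (0, 1), recalibration attenuates rather than amplifies activations; in the full model the learned gates decide which channels and locations to suppress.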
