Abstract
Owing to the intricate and diverse nature of textile defects, detecting them is exceptionally challenging. Deep learning-based defect detection methods generally achieve higher precision than conventional methods, but they require a substantial volume of training data, which is particularly difficult to accumulate for textile flaws. To augment the fabric defect dataset and improve fabric defect detection accuracy, we propose a fabric defect image generation method based on the Pix2Pix generative adversarial network. This approach devises a novel dual-stage W-net generative adversarial network: increasing the network depth enables the model to extract intricate textile image features and expands its capacity for information sharing across stages. The dual-stage W-net generative adversarial network can generate the desired defects on defect-free textile images. Quality assessment of the generated fabric defect images yields peak signal-to-noise ratio and structural similarity values exceeding 30 dB and 0.930, respectively, and a learned perceptual image patch similarity value no greater than 0.085, demonstrating the effectiveness of the fabric defect data augmentation. The effectiveness of the dual-stage W-net generative adversarial network is further established through multiple comparative experiments on the generated images. Comparing detection performance before and after data augmentation, mean average precision improves by 6.13% and 14.57% on the YOLOv5 and Faster R-CNN (region-based convolutional neural network) detection models, respectively.
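As a concrete illustration of the quality assessment described above, the following Python sketch computes the three reported metrics (PSNR, SSIM, LPIPS) for a real/generated image pair using scikit-image and the `lpips` package. This is not the authors' code: the function name, image sizes, and the AlexNet LPIPS backbone are illustrative assumptions, shown only to make the evaluation protocol concrete.

```python
# Minimal sketch of the image-quality evaluation: PSNR and SSIM via
# scikit-image, LPIPS via the `lpips` package. Thresholds mirror the
# abstract (PSNR > 30 dB, SSIM > 0.930, LPIPS <= 0.085); all other
# details are assumptions.
import numpy as np
import torch
import lpips
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# LPIPS network is instantiated once; AlexNet is a common default backbone.
_lpips_fn = lpips.LPIPS(net="alex")


def assess_generated_image(real: np.ndarray, fake: np.ndarray) -> dict:
    """Compare a generated defect image against a real reference.

    Both inputs are H x W x 3 uint8 arrays with values in [0, 255].
    """
    psnr = peak_signal_noise_ratio(real, fake, data_range=255)
    ssim = structural_similarity(real, fake, channel_axis=-1, data_range=255)

    # LPIPS expects N x 3 x H x W float tensors scaled to [-1, 1].
    def to_tensor(img: np.ndarray) -> torch.Tensor:
        return torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0).float() / 127.5 - 1.0

    with torch.no_grad():
        lpips_val = _lpips_fn(to_tensor(real), to_tensor(fake)).item()

    return {"psnr": psnr, "ssim": ssim, "lpips": lpips_val}


if __name__ == "__main__":
    # Synthetic stand-ins for a real/generated fabric image pair.
    rng = np.random.default_rng(0)
    real = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)
    fake = np.clip(real.astype(int) + rng.integers(-5, 6, real.shape), 0, 255).astype(np.uint8)
    print(assess_generated_image(real, fake))
```

A generated image would pass the paper's reported quality bar when `psnr > 30`, `ssim > 0.930`, and `lpips <= 0.085` hold simultaneously.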