Abstract

The YOLO (You Only Look Once) series is renowned for its real-time object detection capabilities in images and videos, and it is highly relevant in industries such as textiles, where speed and accuracy are critical. In the textile industry, accurate fabric type detection and classification are essential for improving quality control, optimizing inventory management, and enhancing customer satisfaction. This paper proposes a new approach based on the YOLOv10 model, which offers enhanced detection accuracy, faster processing, and detection of torn paths in each fabric type. We developed and annotated a specialized dataset of diverse textile samples, including cotton, hanbok, cotton yarn-dyed, and cotton-blend plain fabrics, to detect torn paths in fabric. YOLOv10 was selected for its superior performance, leveraging advances in deep learning architecture, and data augmentation techniques were applied to improve adaptability and generalization across varied textile patterns and textures. Through comprehensive experiments, we demonstrate the effectiveness of YOLOv10, which achieved an accuracy of 85.6% and outperformed previous YOLO variants in both precision and processing speed. Specifically, YOLOv10 showed a 2.4% improvement over YOLOv9, 1.8% over YOLOv8, 6.8% over YOLOv7, 5.6% over YOLOv6, and 6.2% over YOLOv5. These results underscore the significant potential of YOLOv10 for automating fabric defect detection, thereby enhancing operational efficiency and productivity in textile manufacturing and retail.
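The reported gaps can be used to back out the implied accuracies of the earlier YOLO variants. The short sketch below does this arithmetic, assuming the stated improvements are absolute percentage-point differences rather than relative gains (the abstract does not specify which).

```python
# Back out the implied accuracies of earlier YOLO variants from the
# reported YOLOv10 accuracy and its stated improvements.
# Assumption (not confirmed by the abstract): the improvements are
# absolute percentage-point differences, not relative gains.

YOLOV10_ACCURACY = 85.6  # reported YOLOv10 accuracy (%)

# Stated improvement of YOLOv10 over each earlier variant (%).
improvements = {
    "YOLOv9": 2.4,
    "YOLOv8": 1.8,
    "YOLOv7": 6.8,
    "YOLOv6": 5.6,
    "YOLOv5": 6.2,
}

# Implied accuracy of each earlier variant under this assumption.
implied_accuracy = {
    model: round(YOLOV10_ACCURACY - delta, 1)
    for model, delta in improvements.items()
}

for model, acc in implied_accuracy.items():
    print(f"{model}: {acc}%")  # e.g. YOLOv9: 83.2%
```

Under this reading, the earlier variants would sit between roughly 78.8% (YOLOv7) and 83.8% (YOLOv8) on the same benchmark.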
