Abstract
Purpose
Existing clothing parsing methods make little use of dataset-level information. This paper proposes a novel clothing parsing method that exploits higher-level outfit combinatorial consistency knowledge from the whole clothing dataset to improve the accuracy of segmenting clothing images.

Design/methodology/approach
The authors propose an Outfit Memory Net (OMNet) that augments the original features by aggregating dataset-level prior clothing combination information. Specifically, the authors design an Outfit Matrix (OM) to represent the clothing combination information of a single image and an Outfit Memory Module (OMM) to store the clothing combination information of all images in the training set, i.e. dataset-level clothing combination information. In addition, the authors propose a Multi-scale Aggregation Module (MAM) that aggregates the clothing combination information in a multi-scale manner to address the large variance in object scale across clothing images (see the illustrative sketch after this abstract).

Findings
Experiments on the Colorful Fashion Parsing Dataset (CFPD) show that the authors' method achieves 93.15% pixel accuracy (PA) and 51.24% mean class-wise intersection over union (mIoU), which are satisfactory parsing results compared with existing methods such as PSPNet, DANet and DeepLabV3. Moreover, a per-category comparison of segmentation accuracy shows that MAM effectively improves the segmentation of small objects.

Originality/value
With the rise of various online shopping platforms and the continuous development of deep learning technology, applications such as clothing recommendation, matching, classification and virtual try-on systems have emerged in the clothing field. Clothing parsing is the key technology for realizing these applications, so improving its accuracy is necessary.
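The abstract describes the Outfit Matrix (OM) and Outfit Memory Module (OMM) only at a high level. As a rough illustration, the Python sketch below shows one way a dataset-level outfit co-occurrence prior could be accumulated from training segmentation masks. The names outfit_matrix and OutfitMemory, the binary co-occurrence formulation, the class count and the normalization scheme are assumptions made for this sketch; the paper's actual OM/OMM operate inside the network and are not reproduced here.

```python
# Minimal sketch: a dataset-level outfit co-occurrence prior, ASSUMING the
# Outfit Matrix (OM) of an image is a binary co-occurrence matrix over the
# clothing categories present in its segmentation mask, and the Outfit Memory
# Module (OMM) accumulates these matrices over the whole training set.
# These names, shapes and the class count are illustrative, not the paper's code.
import numpy as np

NUM_CLASSES = 23  # e.g. the CFPD label set; exact value is an assumption

def outfit_matrix(mask: np.ndarray, num_classes: int = NUM_CLASSES) -> np.ndarray:
    """Binary co-occurrence matrix of the clothing categories in one mask."""
    present = np.zeros(num_classes, dtype=np.float32)
    present[np.unique(mask)] = 1.0
    return np.outer(present, present)  # OM[i, j] = 1 iff classes i and j co-occur

class OutfitMemory:
    """Accumulates per-image outfit matrices into a dataset-level prior."""
    def __init__(self, num_classes: int = NUM_CLASSES):
        self.counts = np.zeros((num_classes, num_classes), dtype=np.float32)
        self.num_images = 0

    def update(self, mask: np.ndarray) -> None:
        # Add one training image's outfit matrix to the memory.
        self.counts += outfit_matrix(mask, self.counts.shape[0])
        self.num_images += 1

    def prior(self) -> np.ndarray:
        """Normalized co-occurrence prior, e.g. usable to re-weight per-class features."""
        return self.counts / max(self.num_images, 1)

# Usage: build the memory once over the training masks, then query the prior.
memory = OutfitMemory()
for mask in [np.random.randint(0, NUM_CLASSES, size=(64, 64)) for _ in range(10)]:
    memory.update(mask)
print(memory.prior().shape)  # (NUM_CLASSES, NUM_CLASSES)
```

In this reading, the prior captures which clothing categories tend to appear together across the dataset (e.g. skirt with blouse), which is the kind of combinatorial consistency the abstract says OMNet aggregates with the original features.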