Abstract

Common fabric image retrieval methods ignore the diversity and dynamism of user demands: the results are determined solely by the query image and cannot be adjusted dynamically. To address this problem, this study proposes a novel image retrieval method for plaid fabrics based on hand-crafted features and relevance feedback. First, local texture descriptors are extracted by applying the local binary pattern (LBP) to sub-images separated via the Fourier transform, while global texture descriptors are extracted with the scale-invariant feature transform (SIFT) and the vector of locally aggregated descriptors (VLAD). Second, color moments computed over image partitions characterize the spatial color information of plaid fabric images. Third, the extracted features are fused through weight allocation for similarity measurement. Finally, relevance feedback based on meta-learning enables personalized adjustment and optimization of the retrieval results. An image retrieval database was built as a benchmark by collecting over 44,000 plaid fabric samples from a factory. Experiments show that precision and recall at rank eight reach 70.6% and 62.6%, respectively, and mAP reaches 0.690. The results demonstrate that the proposed strategy is feasible and effective, enabling fast and efficient plaid fabric image retrieval.
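To make the descriptor-fusion idea concrete, the following is a minimal sketch of two of the components named above: color moments over a grid partition of the image, and a weighted fusion of per-descriptor distances for similarity measurement. The grid size, weights, and function names are illustrative assumptions, not the paper's actual implementation; the LBP, SIFT, and VLAD stages are omitted.

```python
import numpy as np

def color_moments(img, grid=(2, 2)):
    """First three color moments (mean, std, skewness) per cell of a
    grid partition -- a common spatial color descriptor. The 2x2 grid
    is an assumed example, not the paper's partitioning scheme."""
    h, w, c = img.shape
    gh, gw = grid
    feats = []
    for i in range(gh):
        for j in range(gw):
            cell = img[i * h // gh:(i + 1) * h // gh,
                       j * w // gw:(j + 1) * w // gw].reshape(-1, c)
            mean = cell.mean(axis=0)
            std = cell.std(axis=0)
            # Signed cube root keeps the skewness term on the same scale.
            skew = np.cbrt(((cell - mean) ** 3).mean(axis=0))
            feats.extend([mean, std, skew])
    return np.concatenate(feats)  # grid cells x 3 moments x channels

def fused_distance(query_feats, db_feats, weights):
    """Weighted fusion of per-descriptor Euclidean distances,
    illustrating weight-allocation similarity measurement."""
    return sum(w * np.linalg.norm(q - d)
               for w, q, d in zip(weights, query_feats, db_feats))

# Illustrative use: a 2x2 grid on an RGB image yields a 36-dim vector,
# and fusing identical descriptors gives a distance of zero.
img = np.random.rand(64, 64, 3)
cm = color_moments(img)
print(cm.shape)                                    # (36,)
print(fused_distance([cm, cm], [cm, cm], [0.6, 0.4]))  # 0.0
```

In practice the weights would be tuned (or adapted by the relevance-feedback loop) so that texture and color descriptors contribute in the desired proportion to the final ranking.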
