Abstract

With the advancement of image processing and computer vision technology, content-based product search is applied in a wide variety of common tasks, such as online shopping, automatic checkout systems, and intelligent logistics. Given a product image as a query, existing product search systems mainly perform retrieval over predefined databases with fixed product categories. However, real-world applications often require inserting new categories or updating existing products in the product database. With existing product search methods, the image feature extraction models must be retrained and the database indexes rebuilt to accommodate the updated data, and these operations incur high costs in data annotation and training time. To this end, we propose a few-shot incremental product search framework with meta-learning, which requires very few annotated images and has a reasonable training time. In particular, our framework contains a multipooling-based product feature extractor that learns a discriminative representation for each product, and we also design a meta-learning-based feature adapter to guarantee the robustness of the few-shot features. Furthermore, when expanding new categories in batches during a product search, we reconstruct the few-shot features with an incremental weight combiner to accommodate the incremental search task. Through extensive experiments, we demonstrate that the proposed framework achieves excellent performance on new products while preserving high search accuracy on the base categories as new product categories are gradually added, without catastrophic forgetting.
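The abstract does not detail the multipooling operation, but a common reading is that several global pooling operators are applied to a CNN feature map and their outputs are fused into a single retrieval descriptor. The sketch below is a minimal NumPy illustration under that assumption (concatenated global average and max pooling, followed by L2 normalization); the function names, feature-map shapes, and cosine-similarity ranking are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def multipool_descriptor(feature_map):
    """Fuse global average- and max-pooled statistics into one descriptor.

    feature_map: (C, H, W) activations from a CNN backbone (assumed layout).
    Returns an L2-normalized (2*C,) vector suitable for cosine-similarity search.
    """
    avg = feature_map.mean(axis=(1, 2))   # global average pooling
    mx = feature_map.max(axis=(1, 2))     # global max pooling
    desc = np.concatenate([avg, mx])
    return desc / (np.linalg.norm(desc) + 1e-12)

# Toy retrieval: rank 5 random "gallery" products against a random query.
rng = np.random.default_rng(0)
query = multipool_descriptor(rng.standard_normal((64, 7, 7)))
gallery = [multipool_descriptor(rng.standard_normal((64, 7, 7))) for _ in range(5)]
scores = [float(query @ g) for g in gallery]  # cosine similarity (unit vectors)
best = int(np.argmax(scores))
```

Because the descriptors are unit-normalized, the dot product above is exactly cosine similarity, so ranking the gallery by `scores` gives the nearest products to the query.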
