Abstract

Product image search aims to retrieve similar product images given a query image. While deep-learning-based features work well for retrieving images of the same category (e.g., "searching for T-shirts among all clothing images"), they perform poorly when retrieving variants within a category (e.g., "searching for Chelsea Football Club uniforms among all T-shirt images"), since this requires fine-grained matching of image details. In this paper, we present a spatial quantization approach that combines spatial pyramid pooling (SPP) with the vector of locally aggregated descriptors (VLAD) to extract more discriminative features for style-aware product search. The proposed spatial quantization encodes spatial information into the image feature, improving fine-grained product image search. Finally, experiments on a large-scale real-world dataset from the Alibaba Large-scale Image Search Challenge (ALISC) demonstrate the effectiveness of our method.
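The core idea, combining VLAD encoding with a spatial pyramid so that location information survives aggregation, can be sketched as below. This is a minimal illustrative sketch, not the authors' exact pipeline: the function names (`vlad_encode`, `spatial_vlad`), the pyramid levels, and the use of a raw convolutional feature map as the descriptor source are all assumptions for illustration.

```python
import numpy as np

def vlad_encode(descriptors, centroids):
    """VLAD: assign each local descriptor to its nearest codebook
    centroid, accumulate the residuals per centroid, then flatten
    and L2-normalize the result."""
    k, d = centroids.shape
    # nearest-centroid (hard) assignment of every descriptor
    dists = np.linalg.norm(descriptors[:, None, :] - centroids[None, :, :], axis=2)
    assign = np.argmin(dists, axis=1)
    vlad = np.zeros((k, d))
    for i in range(k):
        mask = assign == i
        if mask.any():
            # sum of residuals (descriptor - centroid) for this cluster
            vlad[i] = (descriptors[mask] - centroids[i]).sum(axis=0)
    vlad = vlad.reshape(-1)
    # signed square-root (power) normalization, then global L2
    vlad = np.sign(vlad) * np.sqrt(np.abs(vlad))
    norm = np.linalg.norm(vlad)
    return vlad / norm if norm > 0 else vlad

def spatial_vlad(feature_map, centroids, levels=(1, 2)):
    """Hypothetical spatial-quantization sketch: VLAD-encode each cell
    of a spatial pyramid over an H x W x D feature map and concatenate
    the per-cell codes, so spatial layout is preserved in the feature."""
    h, w, d = feature_map.shape
    parts = []
    for n in levels:              # pyramid level n -> n x n grid of cells
        for i in range(n):
            for j in range(n):
                cell = feature_map[i * h // n:(i + 1) * h // n,
                                   j * w // n:(j + 1) * w // n]
                parts.append(vlad_encode(cell.reshape(-1, d), centroids))
    return np.concatenate(parts)
```

With levels (1, 2) the descriptor has 1 + 4 = 5 cells, each contributing a K×D VLAD block, so two images only match well when residual statistics agree region by region, which is what fine-grained, style-aware retrieval needs.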
