Abstract

This article proposes an algorithm to more efficiently search for clothing product images that are similar to a new input clothing product image. Convolutional Neural Network (CNN) and Approximate Nearest Neighbors Oh Yeah (ANNOY) technologies were applied to a database of 60,000 clothing images, and the similarity accuracy and processing rates of the two technologies were compared. The conventional CNN approach finds similar images by exhaustively comparing the query against every image in the database, while ANNOY builds a binary tree whose nodes partition the images by the similarity distance measured between them, so a query only needs to descend the tree. ANNOY drastically reduces image search time, although the image similarity accuracy decreases slightly. The reduction in search time saves costs, and the rapid processing rate enables the technology to be applied to various kinds of online services, including product search, product comparison, and product recommendation.
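The contrast the abstract draws, exhaustive comparison of feature vectors versus an ANNOY-style random-projection binary tree, can be sketched as follows. This is a minimal illustration under assumed data (random vectors standing in for CNN feature embeddings); all names and parameters here are hypothetical and not taken from the article's implementation.

```python
# Sketch: exhaustive nearest-neighbor search vs an ANNOY-style tree.
# The vectors below are random stand-ins for CNN feature embeddings.
import random
import math

random.seed(0)
DIM, N = 16, 1000

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Pretend these are CNN feature vectors for 1000 product images.
vectors = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N)]

def exhaustive_nn(query):
    """Baseline: compare the query against every stored vector."""
    return min(range(N), key=lambda i: dist(vectors[i], query))

def build_tree(indices, leaf_size=16):
    """ANNOY-style tree: split points by a random hyperplane at each node."""
    if len(indices) <= leaf_size:
        return indices  # leaf: a small bucket of candidate items
    normal = [random.gauss(0, 1) for _ in range(DIM)]
    mid = sorted(dot(vectors[i], normal) for i in indices)[len(indices) // 2]
    left = [i for i in indices if dot(vectors[i], normal) < mid]
    right = [i for i in indices if dot(vectors[i], normal) >= mid]
    if not left or not right:  # degenerate split: stop and keep the bucket
        return indices
    return (normal, mid, build_tree(left, leaf_size), build_tree(right, leaf_size))

def tree_nn(tree, query):
    """Approximate search: descend one branch, then scan only that leaf."""
    while isinstance(tree, tuple):
        normal, mid, left, right = tree
        tree = left if dot(query, normal) < mid else right
    return min(tree, key=lambda i: dist(vectors[i], query))

tree = build_tree(list(range(N)))
query = vectors[42]
print(exhaustive_nn(query))  # exact search scans all N vectors
print(tree_nn(tree, query))  # approximate search scans only one small leaf
```

The trade-off described in the abstract is visible here: the tree query inspects only one leaf of roughly `leaf_size` candidates instead of all N vectors, which is why search time drops sharply, while a query near a splitting hyperplane can land in the "wrong" leaf, which is why accuracy decreases slightly. (The production ANNOY library mitigates this by building a forest of many such trees and merging their candidates.)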
