Abstract

With the prevalent use of mobile cameras to capture images, the demand for efficient and effective methods for indexing and retrieving personal image collections on mobile devices has also risen. In this paper, we propose to represent images with hash codes, which are compact representations of deep convolutional features produced by a deep auto-encoder on the cloud. To preserve user privacy, each image is first encrypted with a lightweight encryption algorithm on the mobile device before being offloaded to the cloud for feature extraction, which eliminates the computationally expensive feature extraction step on resource-constrained devices. A pre-trained convolutional neural network (CNN) extracts features, which are then transformed into compact binary codes by a deep auto-encoder. The hash codes are sent back to the mobile device, where they are stored in a hash table along with the image locations. An approximate nearest neighbor (ANN) search is used to retrieve the desired images efficiently without exhaustively scanning the entire image collection. The proposed method is evaluated on three publicly available image datasets, namely Corel-10K, GHIM-10K, and the Product image dataset. Experimental results demonstrate that the CNN and auto-encoder based feature representation performs substantially better than several state-of-the-art hashing schemes for image retrieval on mobile devices.
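The retrieval side of this pipeline can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: a random projection followed by sign thresholding stands in for the encoder half of the trained deep auto-encoder, randomly generated vectors stand in for the CNN features returned from the cloud, and the names `encode_to_hash` and `hamming_distance` are hypothetical. It also ranks codes with a plain linear Hamming scan for brevity, whereas the paper's ANN search is meant to avoid exhaustively comparing against every stored code.

```python
# Conceptual sketch: compress feature vectors to binary hash codes and
# rank stored images by Hamming distance to the query code.
import numpy as np

rng = np.random.default_rng(0)

FEATURE_DIM = 4096   # assumed dimensionality of a CNN feature vector
HASH_BITS = 64       # assumed length of the compact binary code

# Stand-in for the encoder half of the trained deep auto-encoder.
projection = rng.standard_normal((FEATURE_DIM, HASH_BITS))

def encode_to_hash(features: np.ndarray) -> np.ndarray:
    """Map a real-valued feature vector to a binary hash code (0/1 array)."""
    return (features @ projection > 0).astype(np.uint8)

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two binary codes."""
    return int(np.count_nonzero(a != b))

# Placeholder gallery: random vectors standing in for the CNN features of
# the user's image collection; each entry stores (hash code, image location).
gallery_features = rng.standard_normal((1000, FEATURE_DIM))
hash_table = [(encode_to_hash(f), f"image_{i:04d}.jpg")
              for i, f in enumerate(gallery_features)]

# Query: hash the query image's features, then return the k images whose
# codes are closest in Hamming distance (linear scan here; an ANN index
# would prune this search in practice).
query_features = gallery_features[42] + 0.1 * rng.standard_normal(FEATURE_DIM)
query_code = encode_to_hash(query_features)
top_k = sorted(hash_table, key=lambda entry: hamming_distance(query_code, entry[0]))[:5]
print([path for _, path in top_k])
```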
