Abstract

Hashing is an effective method for retrieving similar images from a large-scale database. However, with a single hash table, achieving a good recall rate requires searching an exponentially growing number of hash buckets within a large Hamming distance, which is time-consuming. Taking the union of results from multiple hash tables (multi-hashing) with exact hash-code matching yields high recall but low precision. Existing image-filtering methods remove dissimilar images based on the Hamming distance or the hash-code difference between the query and candidate images. However, they treat all hash buckets as equally important, which is generally not true: different buckets may return different numbers of images and contribute differently to the retrieval results. We propose two descriptors, the bucket sensitivity measure and the location sensitivity measure, to score both a hash bucket and the candidate images it contains using a location-based sensitivity measure. A radial basis function neural network (RBFNN) is then trained to filter out dissimilar images based on the Hamming distance, the hash-code difference, and the two proposed descriptors. Since the Hamming distance and the hash-code difference are readily computed by all hashing-based image retrieval methods, and both the RBFNN and the two proposed sensitivity-based descriptors are computed offline once the hash tables become available, the proposed sensitivity-based image filtering method is efficient for large-scale image retrieval. Experimental results on four large-scale databases show that the proposed method improves precision at the expense of a small drop in recall for both data-dependent and data-independent multi-hashing methods, as well as for multi-hashing combining both types.
