In this work, we propose an efficient image re-ranking method that re-ranks all retrieved images without storing regional information for each indexed feature in the inverted file. The method is motivated by the observation that many visual words in the query image vote only for irrelevant images. Accordingly, we propose to re-rank the retrieved images using only the visual words that are actually useful for finding relevant images. To this end, given an initial ranking of the retrieved images, we first identify a set of images similar to the query by maximizing a quadratic function. The quadratic function is built from an affinity matrix that stores the pairwise similarities among a short list of top-ranked images, where the similarity between any two images is computed by the proposed graph diffusion. We then select a subset of the query's visual words with an alternating optimization strategy: (1) at each iteration, visual words are selected based on the current set of similar images, and (2) in turn, the set of similar images is updated using the selected words. These two steps are repeated until convergence. Experimental results on standard benchmark datasets show that the proposed method is an order of magnitude faster than state-of-the-art spatial re-ranking techniques while also achieving much better retrieval quality.
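To make the alternating procedure concrete, the following is a minimal sketch, not the paper's implementation. It assumes the diffusion-based affinity matrix `W` over the top-ranked short list and a binary word-vote matrix `votes` are already computed; the greedy surrogate for the quadratic maximization, the `purity` threshold for word selection, and the set size `k` are illustrative assumptions.

```python
import numpy as np

def greedy_quadratic_init(W, k):
    """Greedy surrogate for maximizing y^T W y over k-subsets of images
    (the exact quadratic maximization is assumed; this is an approximation)."""
    W = np.asarray(W, dtype=float)
    chosen = [int(np.argmax(W.sum(axis=1)))]     # start from the best-connected image
    while len(chosen) < k:
        gains = W[:, chosen].sum(axis=1)          # affinity gain toward current set
        gains[chosen] = -np.inf                   # never re-pick a chosen image
        chosen.append(int(np.argmax(gains)))
    return set(chosen)

def alternating_rerank(W, votes, k=5, purity=0.5, max_iter=10):
    """
    W:     (n, n) affinity matrix among the top-n retrieved images
           (assumed precomputed, e.g. by diffusion on an image graph).
    votes: (m, n) binary matrix; votes[w, i] = 1 if query word w matched image i.
    Returns a ranking over the n images and the indices of selected words.
    """
    n = W.shape[0]
    similar = greedy_quadratic_init(W, k)         # initial similar-image set
    for _ in range(max_iter):
        # (1) keep words whose votes fall mostly on the current similar set
        mask = np.zeros(n, dtype=bool)
        mask[list(similar)] = True
        on_set = votes[:, mask].sum(axis=1)
        total = votes.sum(axis=1)
        selected = np.where(on_set / np.maximum(total, 1) >= purity)[0]
        # (2) re-score every image using only the selected words
        scores = votes[selected].sum(axis=0)
        new_similar = set(np.argsort(-scores)[:k].tolist())
        if new_similar == similar:                # converged: set is stable
            break
        similar = new_similar
    return np.argsort(-scores), selected
```

In this sketch, step (1) discards words whose votes scatter over images outside the similar set, and step (2) lets the surviving words redefine which images count as similar, mirroring the alternation described above.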