Abstract

Visual features and representation learning strategies have advanced enormously over the past decade, mainly driven by deep learning approaches. However, retrieval tasks are still performed mostly with traditional pairwise dissimilarity measures, even though the learned representations lie on high-dimensional manifolds. With the aim of going beyond pairwise analysis, post-processing methods have been proposed to replace pairwise measures with globally defined measures capable of analyzing collections in terms of the underlying data manifold. The most representative approaches are diffusion and rank-based methods. While diffusion approaches can be computationally expensive, rank-based methods lack theoretical background. In this paper, we propose an efficient Rank-based Diffusion Process which combines both approaches and avoids the drawbacks of each one. The proposed method efficiently approximates a diffusion process by exploiting rank-based information while assuring its convergence. The algorithm exhibits very low asymptotic complexity and can be computed regionally, making it suitable for queries outside of the dataset. An experimental evaluation conducted for image retrieval and person re-ID tasks on diverse datasets demonstrates the effectiveness of the proposed approach, with results comparable to the state of the art.

Highlights

  • The evolution of image retrieval approaches was mainly supported by the development of novel features for representing the visual content

  • The presentation of our method is organized in four main steps: (i) a similarity measure is defined based on ranking information; (ii) a normalization is conducted for improving the symmetry of ranking references; (iii) the rank diffusion process is performed, requiring a small number of iterations; (iv) a post-diffusion step is conducted for exploiting the reciprocal rank information

  • The result of the rank diffusion process, given by the matrix P(θ), is subsequently column normalized according to Equation (4) (a minimal sketch of these steps follows this list)
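
Since the highlights only outline the four steps and Equation (4) is not reproduced here, the following is a minimal NumPy sketch of how such a pipeline could be organized: a rank-based similarity, a symmetry-improving normalization, a few diffusion iterations followed by column normalization, and a reciprocal-rank post-processing step. The function names, the weighting scheme, the neighborhood size k, and the update rule are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch (not the authors' implementation) of a four-step rank-based
# diffusion pipeline: (i) rank similarity, (ii) symmetry-improving normalization,
# (iii) a few diffusion iterations with column normalization, (iv) reciprocal-rank
# post-processing. Weights, k, alpha, and the update rule are illustrative.
import numpy as np

def rank_similarity(D, k=20):
    """Step (i): affinity derived from ranking positions (assumed weighting)."""
    n = D.shape[0]
    ranks = np.argsort(np.argsort(D, axis=1), axis=1)  # position of each item in each ranked list
    W = np.zeros((n, n))
    top_k = ranks < k
    W[top_k] = 1.0 - ranks[top_k] / k                  # top positions receive higher weights
    return W

def symmetric_normalization(W):
    """Step (ii): improve the symmetry of the ranking references."""
    return (W + W.T) / 2.0

def rank_diffusion(W, iterations=3, alpha=0.9):
    """Step (iii): a few diffusion iterations, then column normalization of the result."""
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)   # row-stochastic transition matrix
    A = W.copy()
    for _ in range(iterations):
        A = alpha * (P @ A @ P.T) + (1.0 - alpha) * W         # assumed diffusion update
    # column normalization of the diffused matrix (the role Equation (4) plays in the paper)
    return A / np.maximum(A.sum(axis=0, keepdims=True), 1e-12)

def reciprocal_rank_step(A, k=20):
    """Step (iv): reinforce affinities supported by reciprocal top-k neighborhoods."""
    ranks = np.argsort(np.argsort(-A, axis=1), axis=1)
    reciprocal = (ranks < k) & (ranks.T < k)
    return A * (1.0 + reciprocal)

# Usage with a toy symmetric distance matrix D (n x n, zero diagonal):
D = np.random.rand(50, 50); D = (D + D.T) / 2.0; np.fill_diagonal(D, 0.0)
A = reciprocal_rank_step(rank_diffusion(symmetric_normalization(rank_similarity(D))))
improved_ranked_lists = np.argsort(-A, axis=1)  # refined ranked lists for retrieval
```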


Summary

Introduction

The evolution of image retrieval approaches was mainly supported by the development of novel features for representing the visual content. While rank-based approaches focus on the similarity encoded in the top positions of ranked lists, reducing the computational cost, diffusion approaches benefit from a strong mathematical background. In this scenario, the Rank Diffusion Process with Assured Convergence (RDPAC) is proposed in this paper, based on an efficient formulation capable of avoiding the computation of small and irrelevant similarity values. The main idea consists of exploiting the rank information to identify and index the high similarity values in the transition and affinity matrices. In this way, the method admits an efficient algorithmic solution capable of computing an effective approximation of diffusion processes, which makes it robust to feature variations and suitable for fusion tasks.
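
To make this idea concrete, the sketch below is a hypothetical illustration (not the RDPAC algorithm itself) of how rank information can restrict a diffusion process to the high similarity values: only the top-L positions of each ranked list are kept in a sparse transition matrix before a few propagation steps are run. The neighborhood size L, the exponential similarity kernel, and the damping factor alpha are assumptions made for the example.

```python
# Hypothetical illustration (not the RDPAC algorithm itself): rank information is
# used to keep only the top-L similarities per image in a sparse transition matrix,
# so the diffusion iterations never compute the small, irrelevant similarity values.
import numpy as np
from scipy.sparse import csr_matrix, diags

def topL_transition(D, L=20):
    """Row-stochastic transition matrix restricted to each item's top-L ranked neighbors."""
    n = D.shape[0]
    neighbors = np.argsort(D, axis=1)[:, :L]                 # ranked lists truncated at depth L
    sims = np.exp(-D[np.arange(n)[:, None], neighbors])      # assumed similarity kernel
    rows = np.repeat(np.arange(n), L)
    W = csr_matrix((sims.ravel(), (rows, neighbors.ravel())), shape=(n, n))
    return diags(1.0 / np.asarray(W.sum(axis=1)).ravel()) @ W

def sparse_diffusion(P, iterations=3, alpha=0.85):
    """A few damped propagation steps; sparse products keep the cost tied to the retained neighborhoods."""
    A = P.copy()
    for _ in range(iterations):
        A = alpha * (P @ A) + (1.0 - alpha) * P
    return A

# Usage with a toy symmetric distance matrix:
D = np.random.rand(100, 100); D = (D + D.T) / 2.0; np.fill_diagonal(D, 0.0)
A = sparse_diffusion(topL_transition(D))
refined_ranked_lists = np.argsort(-A.toarray(), axis=1)      # ranked lists after diffusion
```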

Rank Diffusion Process with Assured Convergence
Rank Similarity Measure
Pre-Diffusion Rank Normalization
Rank Diffusion with Assured Convergence
Proof of Convergence
Post-Diffusion Reciprocal Analysis
Rank Fusion
Efficiency and Complexity Aspects
Efficient Algorithmic Solutions
Complexity Analysis
Regional Diffusion for Unseen Queries
Experimental Protocol and Implementation Aspects
Parametric Space Analysis
General Image Retrieval Results
Method
Person Re-ID Results
Visual Analysis
Efficiency Evaluation
Comparison with Other Approaches
Findings
Conclusions
