Abstract
In recent years, learning-based image hashing techniques have attracted wide interest among researchers because they can be applied to high-dimensional data such as videos and images. Supervised hashing techniques achieve satisfactory retrieval performance but rely heavily on label information, which has motivated the development of unsupervised hashing methods. However, existing unsupervised hashing methods often fail to extract effective similarity information from the training set, which severely degrades their retrieval performance. Some pseudo-label-based methods partially alleviate this problem but are sensitive to the predefined number of categories. In this paper, we address these problems by proposing a sparse graph based self-supervised hashing method. The constructed sparse graph not only circumvents the need for a predefined number of categories, unlike pseudo-label-based methods, but also significantly reduces the memory demand compared with dense graph-based methods. In addition, we exploit a self-supervised reconstruction constraint to further preserve semantic information. These terms are combined linearly and optimized using an iterative strategy. Four representative datasets, including two single-label and two multi-label datasets, are employed to evaluate our method. Results across multiple metrics show that our method outperforms other state-of-the-art methods.
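To make the memory argument concrete, the sketch below builds a sparse k-nearest-neighbour similarity graph over image features, the kind of structure the abstract contrasts with a dense affinity matrix. This is a minimal illustration, not the paper's exact construction: the feature dimensionality, the choice of k, and the symmetrisation step are all assumptions for the example.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

# Stand-in for deep features of 1000 images (64-dim); real features
# would come from a pretrained network.
rng = np.random.default_rng(0)
features = rng.standard_normal((1000, 64))

k = 10
# Sparse k-NN graph: each image keeps edges only to its k nearest
# neighbours, so storage is O(n*k) instead of O(n^2) for a dense
# affinity matrix.
sparse_graph = kneighbors_graph(features, n_neighbors=k, mode="distance")

# Symmetrise so similarity is mutual (one common choice; the paper's
# construction may differ).
sparse_graph = sparse_graph.maximum(sparse_graph.T)

dense_entries = features.shape[0] ** 2
print(sparse_graph.nnz, "stored entries vs", dense_entries, "in a dense graph")
```

For 1000 images the sparse graph stores at most 2nk = 20,000 entries instead of 1,000,000, and the gap grows quadratically with the dataset size, which is the memory advantage the abstract claims over dense graph-based methods.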