Abstract

Image retrieval (IR) for practical remote sensing (RS) applications should combine high retrieval accuracy with storage and computational efficiency, while not relying on large-scale annotation. However, current supervised and unsupervised RS-IR methods do not yet fully meet these requirements. To this end, we propose a novel hashing-based IR approach that learns hash codes from openly available, representative self-supervised features. Specifically, we construct a model from a self-supervised pre-trained backbone and a small multi-layer perceptron (MLP)-based hash learning network. Features from the frozen backbone are used to reconstruct a similarity matrix that guides the learning of the hash network, so that the semantic structure of the data is preserved. To further strengthen the approach, we exploit global high-level semantic information within the similarity reconstruction process by introducing a small set of labeled data. Extensive comparative experiments on two commonly used RS image datasets demonstrate that our approach outperforms competing methods and strikes a good balance between retrieval accuracy and the amount of annotation used. On these two datasets, the labeled data required by our method amounts to less than 3% of that required by traditional methods, yet the resulting mAP exceeds 90%, close to that of current state-of-the-art supervised methods. Additionally, we analyze the specific effects of our design choices and the associated hyperparameters.
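The core idea described above — using a similarity matrix reconstructed from frozen self-supervised features as the supervision signal for a small hash network — can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the random features stand in for backbone outputs, the single-layer "hash network" is untrained, and all dimensions and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for features from a frozen self-supervised
# backbone: 8 images, 128-dim each. In the real method these would come
# from a pre-trained model; here they are random for illustration.
features = rng.normal(size=(8, 128))

# Reconstruct a cosine-similarity matrix S from the frozen features;
# S guides the hash learning so that semantic structure is preserved.
normed = features / np.linalg.norm(features, axis=1, keepdims=True)
S = normed @ normed.T  # entries in [-1, 1], diagonal equal to 1

# Tiny illustrative "hash network": one linear layer plus tanh as a
# smooth relaxation of the sign function (weights untrained here).
code_len = 16
W = rng.normal(size=(128, code_len)) * 0.1
codes = np.tanh(features @ W)  # relaxed binary codes in (-1, 1)

# Similarity-preservation objective: the scaled inner products of the
# codes should match S; training would minimize this over W.
loss = np.mean(((codes @ codes.T) / code_len - S) ** 2)

# At retrieval time the codes are binarized and compared by fast
# Hamming distance instead of full feature comparison.
binary = np.sign(codes)
```

This captures why the approach is storage- and computation-efficient: after training, each image is represented by a short binary code rather than a high-dimensional feature vector.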
