Abstract

As remote sensing technology continues to mature, the quality and quantity of acquired remote sensing images have surpassed any previous period. In this context, the content-based remote sensing image retrieval (CBRSIR) task has attracted considerable attention and research interest. However, existing CBRSIR works still face the following problems. First, few works can perform one-to-many cross-modal image retrieval (e.g., using an optical image to retrieve SAR and optical images simultaneously). Second, research has mainly focused on small-area, target-level retrieval, with little attention to semantic-level retrieval of the whole image. Last but not least, most existing networks involve massive parameter counts and heavy computation, and therefore cannot be deployed on resource-constrained edge devices with power and storage limits. To alleviate these bottlenecks, this paper introduces a novel lightweight non-local semantic fusion network based on a hypergraph structure for CBRSIR, abbreviated as HGNLSF-Net. Specifically, by exploiting the topological characteristics of a hypergraph, the framework models relationships among multiple nodes at once, allowing it to capture the global features of remote sensing images with fewer parameters and less computation. In addition, since non-local semantics often contain considerable noise, a hard-link module is constructed to filter this noise. A series of experiments on a typical CBRSIR dataset, MMRSIRD, shows that the proposed HGNLSF-Net outperforms other methods with fewer parameters and achieves the best retrieval performance.
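The abstract does not specify the exact layer design of HGNLSF-Net, but a minimal sketch of a generic hypergraph convolution (in the HGNN style) illustrates the property being leveraged: a single hyperedge connects many nodes, so one update aggregates non-local relations cheaply. The incidence matrix, feature dimensions, and weights below are illustrative assumptions only, not the actual components of HGNLSF-Net or its hard-link module.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One generic hypergraph convolution step:
    X' = Dv^-1/2 H W De^-1 H^T Dv^-1/2 X Theta  (HGNN-style; illustrative only).

    X:     (n_nodes, in_dim)  node features (e.g., patch descriptors of an image)
    H:     (n_nodes, n_edges) incidence matrix, H[v, e] = 1 if node v lies in hyperedge e
    Theta: (in_dim, out_dim)  learnable projection
    """
    W = np.eye(H.shape[1])                                            # uniform hyperedge weights (assumption)
    Dv = np.diag(1.0 / np.sqrt(H @ W @ np.ones(H.shape[1]) + 1e-8))   # inverse-sqrt node degrees
    De = np.diag(1.0 / (H.sum(axis=0) + 1e-8))                        # inverse hyperedge degrees
    return Dv @ H @ W @ De @ H.T @ Dv @ X @ Theta

# Toy example: 6 nodes grouped by 3 hyperedges; each hyperedge spans several nodes,
# which is what lets many-to-many (non-local) relations be modeled with few parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))
H = np.array([[1, 0, 1],
              [1, 0, 0],
              [1, 1, 0],
              [0, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
Theta = rng.normal(size=(8, 4))
print(hypergraph_conv(X, H, Theta).shape)  # (6, 4)
```

Note that the only learnable weight here is Theta, shared across all nodes and hyperedges, which is why hypergraph aggregation can model global structure with far fewer parameters than dense pairwise attention.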
