Abstract

Remote sensing scene classification (RSSC) is a fundamental but challenging task in remote sensing (RS). Significant recent advances in RSSC have been achieved using convolutional neural networks. The growing volume of RS images has exposed two difficult problems: intra-class diversity and inter-class similarity. Existing approaches cannot effectively attend to both global and local key features simultaneously. Therefore, in this letter, we propose a dual-branch global-local attention network (DBGA-Net) to address this problem. Specifically, effective complementary features are first extracted from two distinct models. We then design a global-local attention module that concurrently attends to the global features of the entire image and the pertinent local regions within it, so as to extract valid information from RS images. Finally, a fusion module efficiently combines the extracted complementary features to yield more comprehensive scene information. To verify the effectiveness of DBGA-Net, experiments are conducted on three scene classification datasets, achieving accuracies of 99.86%, 97.60%, and 96.06%, respectively, with significant classification performance compared with state-of-the-art (SOTA) methods. Our code and images are available at https://github.com/ZhouYao1020/DBGANet.
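The abstract describes two ideas: a global-local attention mechanism that pools both whole-image features and attention-weighted local regions, and a fusion module that concatenates complementary descriptors from two branches. The sketch below is a minimal NumPy illustration of that general scheme only; the function names, shapes, and the simple concatenation fusion are assumptions for illustration, not the paper's actual DBGA-Net implementation (see the linked repository for that).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_local_attention(feat, w_attn):
    """Hypothetical global-local pooling for one branch.

    feat:   (H*W, C) spatial feature map flattened over positions
    w_attn: (C,) learned scoring vector (random here, for illustration)
    """
    global_feat = feat.mean(axis=0)                   # global: average over all positions -> (C,)
    scores = softmax(feat @ w_attn)                   # local: spatial attention weights -> (H*W,)
    local_feat = scores @ feat                        # attention-weighted pooling -> (C,)
    return np.concatenate([global_feat, local_feat])  # (2C,) global+local descriptor

def fuse(branch_a, branch_b):
    # Assumed fusion: concatenate the two branches' complementary descriptors.
    return np.concatenate([branch_a, branch_b])

rng = np.random.default_rng(0)
fa = global_local_attention(rng.normal(size=(49, 8)), rng.normal(size=8))  # branch 1 (e.g. 7x7 map, 8 channels)
fb = global_local_attention(rng.normal(size=(49, 8)), rng.normal(size=8))  # branch 2
desc = fuse(fa, fb)
print(desc.shape)  # (32,) fused scene descriptor fed to a classifier
```

In a real network the attention weights and the two backbones would be learned end to end; this sketch only shows how a global pooled vector and an attention-focused local vector can be combined per branch and then fused across branches.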
