Abstract

The classification of high-resolution (HR) synthetic aperture radar (SAR) images is of great importance for SAR scene interpretation and application. However, the intricate spatial structural patterns and complex statistical nature of SAR data make classification a challenging task, especially when labeled SAR data are limited. This paper proposes a novel HR SAR image classification method using a multi-scale deep feature fusion network and a covariance pooling manifold network (MFFN-CPMN). MFFN-CPMN combines the advantages of local spatial features and global statistical properties and considers multi-feature information fusion of SAR images in representation learning. First, we propose a Gabor-filtering-based multi-scale feature fusion network (MFFN), a deep convolutional neural network (CNN), to capture spatial patterns and obtain discriminative features of SAR images. To make full use of the large amount of unlabeled data, the weights of each MFFN layer are optimized with an unsupervised denoising dual-sparse encoder. Moreover, the feature fusion strategy in the MFFN effectively exploits the complementary information between different levels and different scales. Second, we utilize a covariance pooling manifold network to further extract the global second-order statistics of SAR images over the fused feature maps; the resulting covariance descriptor is more discriminative for various land covers. Experimental results on four HR SAR images demonstrate the effectiveness of the proposed method, which achieves promising results compared with other related algorithms.
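The covariance pooling step computes a global second-order (covariance) descriptor from the fused feature maps. As a rough illustration only, the NumPy sketch below shows one common way such a descriptor can be formed, with a log-Euclidean mapping so the symmetric positive-definite matrix can be handled by ordinary classifiers; the function name, the regularization, and the mapping choice are assumptions for illustration, not the authors' exact manifold network.

```python
import numpy as np

def covariance_pooling(feature_maps, eps=1e-5):
    """Global covariance (second-order) pooling over CNN feature maps.

    feature_maps: array of shape (C, H, W) -- C fused feature channels.
    Returns a C x C covariance descriptor mapped by its matrix logarithm
    (log-Euclidean mapping) so it can be vectorized for a classifier.
    """
    C, H, W = feature_maps.shape
    X = feature_maps.reshape(C, H * W)          # each row is one channel
    X = X - X.mean(axis=1, keepdims=True)       # center per channel
    cov = (X @ X.T) / (H * W - 1)               # C x C sample covariance
    cov += eps * np.eye(C)                      # regularize to keep it SPD
    # Log-Euclidean mapping: eigen-decompose and take the log of eigenvalues
    w, V = np.linalg.eigh(cov)
    log_cov = (V * np.log(w)) @ V.T
    return log_cov

# Example: descriptor of 64 fused feature channels on a 32x32 patch
desc = covariance_pooling(np.random.randn(64, 32, 32))
print(desc.shape)  # (64, 64)
```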

Highlights

  • The classification of high-resolution (HR) synthetic aperture radar (SAR) images is of great importance for SAR scene interpretation and application

  • To tackle these challenges, we propose a novel HR SAR image classification method using a multi-scale deep feature fusion network and covariance pooling manifold network (MFFN-CPMN)

  • In MFFN-CPMN, deep features and global statistical properties of SAR images are jointly considered in representation learning


Summary

Introduction

The classification of high-resolution (HR) synthetic aperture radar (SAR) images is of great importance for SAR scene interpretation and application. This paper proposes a novel HR SAR image classification method using a multi-scale deep feature fusion network and a covariance pooling manifold network (MFFN-CPMN). We propose a Gabor-filtering-based multi-scale feature fusion network (MFFN) to capture spatial patterns and obtain discriminative features of SAR images, and we utilize a covariance pooling manifold network to further extract the global second-order statistics of SAR images over the fused feature maps. The new generation of spaceborne and airborne SAR sensors can acquire large amounts of high-resolution (HR) SAR images [2]. These data provide sufficient spatial-context information for SAR scene understanding and interpretation. HR SAR images contain more strong scattering points, and the arrangements of the numerous and varied objects have become …
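The summary mentions Gabor-filtering-based multi-scale feature extraction as the front end of the MFFN but gives no filter-bank settings. The sketch below is a generic multi-scale, multi-orientation Gabor bank in NumPy/SciPy; the scales, orientations, and helper names (gabor_kernel, gabor_feature_maps) are illustrative assumptions rather than the authors' configuration.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(ksize, sigma, theta, lambd, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel: a Gaussian envelope modulating a cosine wave."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2)) * \
           np.cos(2 * np.pi * x_t / lambd + psi)

def gabor_feature_maps(image, scales=(2, 4, 8), orientations=4):
    """Filter one SAR amplitude image with a small multi-scale Gabor bank."""
    maps = []
    for sigma in scales:                          # spatial scales
        for k in range(orientations):             # evenly spaced orientations
            theta = k * np.pi / orientations
            kern = gabor_kernel(4 * sigma + 1, sigma, theta, lambd=2 * sigma)
            maps.append(convolve2d(image, kern, mode='same', boundary='symm'))
    return np.stack(maps)                         # (scales * orientations, H, W)

# Example: 12 Gabor responses (3 scales x 4 orientations) for a 64x64 patch
feats = gabor_feature_maps(np.random.rand(64, 64))
print(feats.shape)  # (12, 64, 64)
```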

