Abstract

This paper proposes an automatic target recognition (ATR) method for synthetic aperture radar (SAR) images based on information-decoupled representation. A typical SAR image of a ground target can be divided into three parts: the target region, the shadow, and the background. From the perspective of SAR target recognition, the target region and shadow contain discriminative information. However, they also include some confusing information because of the similarities between different targets. The background mainly contains redundant information, which contributes little to target recognition. Because target segmentation may impair the discriminative information in the target region, the relatively simpler shadow segmentation is performed to separate the shadow region for information decoupling. Then, the information-decoupled representations are generated, i.e., the target image, the shadow, and the original image. The background is retained in the target image, which represents the coupling of target backscattering and background. The original image and the generated target image are classified using sparse representation-based classification (SRC), and their classification results are combined by score-level fusion for target recognition. The shadow image is not used because of its lower discriminability and possible segmentation errors. To evaluate the performance of the proposed method, extensive experiments are conducted on the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset under both the standard operating condition (SOC) and various extended operating conditions (EOCs). The proposed method correctly classifies 10 classes of targets with a percentage of correct classification (PCC) of 94.88% under SOC. With PCCs of 93.15% and 75.03% under configuration variance and a 45° depression angle, respectively, the superiority of the proposed method is demonstrated in comparison with other methods.
The robustness of the proposed method to both uniform and nonuniform shadow segmentation errors is validated with PCCs over 93%. Moreover, with a maximum average precision of 0.9580, the proposed method is more effective than the reference methods at outlier rejection.
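The classification pipeline described in the abstract (per-representation SRC followed by score-level fusion of the two results) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the greedy `omp` solver, the sparsity level `k`, and the equal fusion weight `w` are assumptions introduced here for concreteness.

```python
import numpy as np

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: sparse-code y over dictionary A
    (columns of A are vectorized training images, assumed unit-norm)."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

def src_scores(A, labels, y, k=5):
    """Per-class reconstruction residuals (lower = better match),
    as in SRC: keep only the coefficients belonging to each class."""
    x = omp(A, y, k)
    scores = {}
    for c in set(labels):
        mask = np.array([lab == c for lab in labels])
        scores[c] = np.linalg.norm(y - A @ np.where(mask, x, 0.0))
    return scores

def fuse_and_classify(scores_orig, scores_target, w=0.5):
    """Score-level fusion: weighted sum of the two residual sets,
    then pick the class with the smallest fused score."""
    fused = {c: w * scores_orig[c] + (1 - w) * scores_target[c]
             for c in scores_orig}
    return min(fused, key=fused.get)
```

In practice `scores_orig` and `scores_target` would come from two dictionaries, one built from original training images and one from their decoupled target images; the fusion weight would be tuned on validation data.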

Highlights

  • The interpretation of synthetic aperture radar (SAR) images is important for both civilian and military applications [1,2,3,4,5]

  • This study focused on the automatic target recognition (ATR) of SAR images [1], which aims to determine the target type of an SAR image with unknown label by matching the information in the input SAR image with that in the training samples

  • An SAR ATR method is proposed to exploit the discriminative information in SAR images based on information-decoupled representations in this study

Introduction

The interpretation of synthetic aperture radar (SAR) images is important for both civilian and military applications [1,2,3,4,5]. Because of their rich, physically relevant descriptions, attributed scattering centers have been used effectively for SAR ATR [25,26,27,28,29]. Most of these features aim to reduce the redundancy in the original SAR images and can hardly reduce the confusing information. Under SOC, both the target region and the shadow contain more discriminative than confusing information and tend to exhibit consistent similarity patterns over the training samples. Their joint usage, i.e., the original image, is therefore preferred to best embody the discriminative information and suppress the confusing information.
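To make the information decoupling concrete, a crude sketch of shadow segmentation and target-image generation is given below. The low-intensity percentile threshold and the background-mean fill are illustrative assumptions; the paper's actual segmentation procedure is more involved than a single global threshold.

```python
import numpy as np

def segment_shadow(img, pct=15):
    """Crude shadow mask: pixels below a low-intensity percentile.
    (SAR shadows are low-backscatter regions; the threshold is illustrative.)"""
    thr = np.percentile(img, pct)
    return img <= thr

def decouple(img, shadow_mask, fill=None):
    """Generate the 'target image' representation: keep target
    backscattering plus background, and replace shadow pixels
    (here with the mean of the non-shadow pixels)."""
    if fill is None:
        fill = img[~shadow_mask].mean()
    target_img = img.copy()
    target_img[shadow_mask] = fill
    return target_img
```

The original image, the shadow mask, and `target_img` together form the three decoupled representations; per the abstract, only the original and target images are passed on to the classifiers.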

Information Model of SAR Image
Shadow Segmentation
Background
Initialization
Target Recognition via Score-Level Fusion
Depression Angle Variance
Results
Region Deformation
