Abstract

Synthetic aperture radar (SAR) images have distinctive physical scattering characteristics owing to their unique imaging mechanism. Traditional deep learning algorithms usually extract features from real-valued SAR images in a purely data-driven manner, which may ignore important physical scattering characteristics and sacrifice useful target information in SAR images. This limits further improvement in SAR target recognition performance. To take full advantage of the physical information contained in SAR images, a complex-valued network guided with sub-aperture decomposition (CGS-Net) for SAR target recognition is proposed. Because different targets exhibit different physical scattering characteristics at different angles, sub-aperture decomposition is used to improve accuracy through a multi-task learning strategy. Specifically, the proposed method includes a main task and an auxiliary task, and the performance of the main task is improved by learning and sharing useful information from the auxiliary task. Here, the main task is target recognition and the auxiliary task is target reconstruction. In addition, a complex-valued network is used to extract features from the original complex-valued SAR images, which effectively utilizes both the amplitude and phase information in SAR images. Experimental results on the MSTAR dataset show that the proposed CGS-Net achieves an accuracy of 99.59% (without transfer learning or data augmentation) on the ten-class targets, which is superior to other popular deep learning methods. Moreover, the proposed method has a lightweight network structure, which is well suited to SAR target recognition because labeled SAR data are usually scarce. Experimental results on a small dataset further demonstrate the excellent performance of the proposed CGS-Net.
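
The abstract describes two ingredients: a complex-valued feature extractor that keeps both amplitude and phase, and a multi-task objective in which an auxiliary reconstruction task supports the main recognition task. The sketch below is a minimal, hypothetical illustration of those two ideas (not the authors' implementation): the complex convolution is built from two real-valued convolutions, the magnitude-based pooling, layer sizes, and the weight `aux_weight` are all assumptions for illustration only.

```python
import torch
import torch.nn as nn

class ComplexConv2d(nn.Module):
    """Complex convolution via two real convs: (a+bi)(w_r+i w_i) = (a w_r - b w_i) + i(a w_i + b w_r)."""
    def __init__(self, in_ch, out_ch, k=3, stride=1, padding=1):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, k, stride, padding)
        self.conv_i = nn.Conv2d(in_ch, out_ch, k, stride, padding)

    def forward(self, x_re, x_im):
        out_re = self.conv_r(x_re) - self.conv_i(x_im)
        out_im = self.conv_i(x_re) + self.conv_r(x_im)
        return out_re, out_im

class ToyCVNet(nn.Module):
    """Shared complex-valued encoder feeding a classification head (main task)
    and a reconstruction head (auxiliary task). Sizes are illustrative."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.enc = ComplexConv2d(1, 16)
        self.cls_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(16, num_classes))
        self.rec_head = nn.Conv2d(16, 2, kernel_size=3, padding=1)  # real/imag reconstruction channels

    def forward(self, x_re, x_im):
        f_re, f_im = self.enc(x_re, x_im)
        mag = torch.sqrt(f_re ** 2 + f_im ** 2 + 1e-8)  # magnitude of complex features
        return self.cls_head(mag), self.rec_head(mag)

def joint_loss(logits, labels, recon, recon_target, aux_weight=0.5):
    """Multi-task objective: cross-entropy for recognition plus a reconstruction
    term; aux_weight is a hypothetical trade-off factor."""
    return (nn.functional.cross_entropy(logits, labels)
            + aux_weight * nn.functional.mse_loss(recon, recon_target))
```

In such a setup the encoder weights are shared by both heads, so gradients from the reconstruction term regularize the features used for classification; the actual CGS-Net architecture and loss weighting are given in the paper itself.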
