Abstract

Machine learning methods for synthetic aperture radar (SAR) automatic target recognition (ATR) fall into two main categories: traditional methods and deep learning methods. Deep learning methods learn high-dimensional target features directly and usually achieve high recognition accuracy. However, they do not fully account for the inherent characteristics of SAR targets, which limits their generalization and interpretability. In contrast, traditional methods produce more interpretable and stable results from model-based features. To combine the advantages of both approaches, we propose a target part attention network based on the attributed scattering center (ASC) model, which integrates electromagnetic characteristics into the deep learning framework. First, considering the importance of scattering structure for SAR ATR, we design a target part model based on the ASC model. Second, we propose a novel part attention module based on the scaled dot-product attention mechanism, which directly associates the features of target parts with the classification result. Finally, we derive the importance of each part, which is of great value for practical application and for interpreting SAR ATR. Experiments on the MSTAR data set demonstrate the effectiveness of the proposed part attention network: compared with existing studies, it achieves higher and more robust classification accuracy under complex operating conditions. Furthermore, using the derived part importance, we construct two interpretable analysis methods for the classification results of the deep learning network.
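To make the part attention idea concrete, the sketch below shows one way a scaled dot-product attention module could fuse per-part features into a classification decision while exposing per-part weights that can be read as part importance. This is a minimal, hypothetical illustration: the class name, tensor shapes, single learned query, and layer dimensions are assumptions for clarity, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PartAttention(nn.Module):
    """Illustrative part attention module (not the authors' exact design).

    Applies scaled dot-product attention over per-part feature vectors and
    returns both class logits and per-part attention weights, which serve as
    a proxy for part importance.
    """
    def __init__(self, feat_dim: int, attn_dim: int, num_classes: int):
        super().__init__()
        self.query = nn.Parameter(torch.randn(attn_dim))  # learned global query (assumption)
        self.key_proj = nn.Linear(feat_dim, attn_dim)     # part features -> keys
        self.value_proj = nn.Linear(feat_dim, attn_dim)   # part features -> values
        self.classifier = nn.Linear(attn_dim, num_classes)

    def forward(self, part_feats: torch.Tensor):
        # part_feats: (batch, num_parts, feat_dim)
        keys = self.key_proj(part_feats)                     # (B, P, attn_dim)
        values = self.value_proj(part_feats)                 # (B, P, attn_dim)
        scale = keys.shape[-1] ** 0.5
        scores = (keys @ self.query) / scale                 # (B, P)
        weights = F.softmax(scores, dim=-1)                  # per-part weights
        fused = (weights.unsqueeze(-1) * values).sum(dim=1)  # (B, attn_dim)
        logits = self.classifier(fused)
        return logits, weights                               # weights ~ part importance


# Usage with illustrative sizes: 8 parts, 64-dim part features, 10 MSTAR classes.
model = PartAttention(feat_dim=64, attn_dim=32, num_classes=10)
logits, part_importance = model(torch.randn(4, 8, 64))
```

Because the attention weights sum to one over the parts, inspecting them per sample gives a simple route to the kind of part-level interpretability the abstract describes; the paper's own importance derivation may differ.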
