Abstract
Deep neural networks (DNNs) have achieved remarkable success in various vision tasks. However, their decision mechanism remains obscure, especially in synthetic aperture radar (SAR) target recognition. To expose this mechanism, interpretation algorithms have been proposed that produce saliency maps visualizing a DNN's decisions; these can be categorized into (1) propagation-based, (2) activation-based, and (3) perturbation-based methods. Unfortunately, each has shortcomings in producing saliency maps. Saliency maps produced by propagation-based methods are sensitive to SAR speckle. Activation-based methods have difficulty producing saliency maps whose highlighted regions cover the entire feature. Perturbation-based methods tend to forfeit intricate details of the target during optimization. To mitigate these limitations, this paper proposes Adaptive Perturbation Interpretation (API), with two aims: (1) to improve the interpretability of SAR recognition networks and (2) to recover the fine-grained details lost during the optimization procedures of perturbation-based methods. Specifically, we first introduce a Multi-information Perturbation Optimization Module (MPOM) that retains the target's local characteristics while constraining the saliency map's sparsity through regularization terms. We then introduce a Feature Compensation Module (FCM) to recover the fine-grained features lost during the optimization process in MPOM. Extensive experiments on the SAR interpretation benchmark datasets MSTAR and SARAIRcraft1.0 demonstrate the superiority of API over other state-of-the-art methods.