Abstract
To improve synthetic aperture radar (SAR) target recognition performance, this paper proposes a novel method based on multi-level deep features. The multi-level deep features are learned by a convolutional neural network (CNN) and describe the target characteristics from different aspects. To make full use of the discriminative information contained in these features, joint sparse representation (JSR) is used as the basic classifier; it performs multi-task learning to jointly classify the multi-level deep features, so that each feature level is represented properly while the correlations between different levels are also considered. Based on the JSR solutions, the test sample is assigned to the training class with the minimum reconstruction error. By fully exploiting the discriminative information in the multi-level deep features, the proposed method effectively enhances SAR target recognition performance. The moving and stationary target acquisition and recognition (MSTAR) dataset is employed in the experiments. The results show that the proposed method achieves a high recognition rate of 99.38% for classifying 10 classes of targets under the standard operating condition (SOC), which exceeds those of reference methods drawn from the current literature. Under different types of extended operating conditions (EOCs), the overall performance of the proposed method remains superior to that of the reference methods. In addition, the outlier rejection capability of the proposed method is better than that of the compared methods. These experimental results validate the effectiveness of the proposed method.
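To make the JSR-based decision rule concrete, the sketch below illustrates the minimum-reconstruction-error classification described in the abstract. It is a minimal illustration rather than the authors' implementation: the function and parameter names (jsr_classify, test_feats, dicts, coeffs, class_labels) are hypothetical, and the jointly sparse coefficients are assumed to have been obtained beforehand from an external multi-task sparse solver, which is not shown here.

```python
import numpy as np

def jsr_classify(test_feats, dicts, coeffs, class_labels, classes):
    """Minimal sketch (not the authors' implementation) of the JSR decision rule:
    assign the test sample to the class whose training atoms give the smallest
    total reconstruction error summed over all feature levels.

    test_feats   : list of K feature vectors of the test sample, one per level
    dicts        : list of K dictionaries D_k whose columns are training features
    coeffs       : list of K sparse coefficient vectors, assumed already solved
                   jointly (e.g., by an l_{2,1}-regularized multi-task solver)
    class_labels : 1-D array giving the class label of each dictionary column
    classes      : list of candidate class labels
    """
    errors = []
    for c in classes:
        mask = (class_labels == c)          # columns belonging to class c
        err = 0.0
        for y, D, x in zip(test_feats, dicts, coeffs):
            # reconstruct this feature level using only class-c atoms and
            # their coefficients, then accumulate the squared residual
            err += np.linalg.norm(y - D[:, mask] @ x[mask]) ** 2
        errors.append(err)
    # the predicted label is the class with the minimum total reconstruction error
    return classes[int(np.argmin(errors))]
```

Summing the residuals over all feature levels is what lets the decision exploit every level jointly rather than relying on any single one.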