Abstract

Deep learning has attracted intensive attention in synthetic aperture radar (SAR) automatic target recognition (ATR). Usually, a considerable number of labeled samples is needed to train a deep model with good generalization capability, but sample labeling is time-consuming and costly. This letter proposes an active self-paced deep learning (ASPDL) approach to SAR ATR. In a nutshell, we first introduce Bayesian inference into deep model parameter optimization, aiming to learn a robust classification model from a limited number of labeled samples. Next, a cost-effective sample selection strategy is presented that iteratively and actively selects informative samples from a pool of unlabeled samples for labeling. Concretely, high-confidence samples are selected in a self-paced learning (SPL) manner and automatically pseudo-labeled with the current classification model, whereas low-confidence samples are chosen by an active learning strategy and manually labeled. Finally, the model parameters are updated by minimizing a dual-loss function on a new training set constructed by combining the newly labeled samples with the original ones. Experiments on the moving and stationary target acquisition and recognition (MSTAR) benchmark data demonstrate that the proposed method achieves better classification accuracy with relatively few labeled samples than several state-of-the-art methods.
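The abstract does not give the selection rule itself, so the following is only a minimal sketch of the described split between self-paced pseudo-labeling and active querying, assuming the classifier outputs softmax probabilities and that confidence is measured by the top-class probability. The function name select_samples and the hyperparameters tau_high and k_query are hypothetical illustration choices, not the authors' notation.

```python
import numpy as np

def select_samples(probs, tau_high=0.95, k_query=10):
    """Split an unlabeled pool by the current model's confidence.

    probs: (N, C) array of softmax class probabilities over the
    unlabeled pool. Returns indices of samples to pseudo-label,
    their pseudo-labels, and indices to send for manual labeling.
    """
    confidence = probs.max(axis=1)  # top-class probability per sample
    # Self-paced step: high-confidence samples are pseudo-labeled
    # with the current model's own prediction.
    pseudo_idx = np.where(confidence >= tau_high)[0]
    pseudo_labels = probs[pseudo_idx].argmax(axis=1)
    # Active-learning step: the least confident remaining samples
    # are queried for manual annotation (lowest top-class probability
    # here; entropy or margin sampling would work similarly).
    remaining = np.where(confidence < tau_high)[0]
    order = np.argsort(confidence[remaining])
    query_idx = remaining[order[:k_query]]
    return pseudo_idx, pseudo_labels, query_idx

# Illustrative use with a random 6-class pool:
rng = np.random.default_rng(0)
logits = rng.normal(size=(1000, 6))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
pseudo_idx, pseudo_labels, query_idx = select_samples(probs)
```

Per the abstract, the pseudo-labeled and manually labeled samples would then be merged with the original labeled set and the model retrained by minimizing the dual-loss function, with the whole cycle repeated iteratively.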
