Abstract

With the rapid development of space exploration worldwide, the variety and number of spacecraft have increased sharply, leading to an increasingly complex space environment. To enhance space situational awareness, the most important step is to effectively recognize space targets of interest among diverse spacecraft and debris. Traditional space target recognition approaches rely on manual feature extraction with limited data, resulting in a semantic gap between low-level visual features and high-level semantic representations. Although deep learning models alleviate this problem by learning feature extraction and classification jointly in a unified framework, they easily overfit and generalize poorly when only a few examples are available. To address these issues, we present an end-to-end few-shot deep learning framework for space target recognition, the discriminative deep nearest neighbor neural network (D2N4). D2N4 improves the discriminability of the deeply learned features through two main strategies. On the one hand, we impose an intraclass compactness principle by introducing a center loss that pulls deep features of the same class toward their class centers, thereby overcoming the significant intraclass variation of space targets. On the other hand, we augment each deep local descriptor with global pooling information to reduce interference from local background noise, thus enhancing model robustness. In practice, the deep embedding module and the image-to-class metric module are trained end to end under the joint supervision of the softmax loss and the center loss. Extensive experiments on the space target data set BUAA-SID-share1.0 demonstrate that our simple and effective approach outperforms previous space target recognition methods and is more efficient than recent few-shot approaches. In addition, the proposed framework is equally applicable to natural images and achieves state-of-the-art performance on the CUB-200-2010, Stanford Dogs, and Stanford Cars data sets.
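
To make the joint supervision concrete, below is a minimal PyTorch-style sketch of combining a softmax (cross-entropy) loss with a center loss that pulls embeddings toward learnable per-class centers. The module and function names, the weighting factor lam, and its default value are illustrative assumptions for exposition, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CenterLoss(nn.Module):
    # Maintains one learnable center per class and penalizes the squared
    # distance between each embedding and the center of its own class.
    def __init__(self, num_classes, feat_dim):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features, labels):
        # features: (batch, feat_dim); labels: (batch,) integer class indices
        batch_centers = self.centers[labels]
        return 0.5 * ((features - batch_centers) ** 2).sum(dim=1).mean()

def joint_loss(logits, features, labels, center_loss_fn, lam=0.01):
    # Softmax (cross-entropy) classification term plus a weighted
    # intraclass-compactness term supplied by the center loss.
    return F.cross_entropy(logits, labels) + lam * center_loss_fn(features, labels)

In a full few-shot pipeline of this kind, features would be the image-level embeddings produced by the deep embedding module, logits would come from the classification head or the image-to-class metric module, and the weight lam would be tuned on validation data.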
