Abstract
Deep neural networks have achieved impressive performance for object detection when plenty of labeled training data are available. However, large quantities of labeled data are typically scarce and time-consuming to obtain in deep space exploration. To alleviate this problem, we use domain adaptation (DA) to detect craters in unannotated real data with the help of auto-annotated synthetic data only. Specifically, we present a novel network, CraterDANet, which unifies image- and feature-level adversarial DA to bridge the domain gap between synthetic and real data. To evaluate our method, we present a new lunar crater dataset that contains nearly 20,000 small-scale craters captured from different Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) grayscale images, with large variations in shape, size, overlap, degradation, and illumination conditions. To the best of our knowledge, this is the first lunar crater dataset with diverse illumination conditions that provides bounding box annotations. In experiments, the proposed CraterDANet, trained on labeled synthetic data without any additional real labeled data, achieved an F1-score of 80.27%, slightly better than traditional supervised methods trained on real data. Although its performance still falls short of the state-of-the-art Faster R-CNN (region-based convolutional neural network), it achieves a +5.98% F1-score improvement over Faster R-CNN trained on synthetic data only. This result demonstrates the effectiveness of the proposed method in reducing the domain gap between synthetic and real data.
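The abstract itself gives no implementation detail, but feature-level adversarial DA of the kind described is typically realized with a domain discriminator trained through a gradient-reversal layer (the DANN mechanism of Ganin and Lempitsky). Below is a minimal PyTorch sketch of that mechanism, assuming pooled backbone features of dimension 256; the discriminator shape and the loss weight lam are illustrative placeholders, not CraterDANet's actual design.

    import torch
    import torch.nn as nn

    class GradientReversal(torch.autograd.Function):
        """Identity in the forward pass; negates and scales gradients in the
        backward pass, so features are trained to confuse the discriminator."""
        @staticmethod
        def forward(ctx, x, lam):
            ctx.lam = lam
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            # One gradient per forward input: reversed for x, none for lam.
            return -ctx.lam * grad_output, None

    class DomainDiscriminator(nn.Module):
        """Small binary classifier predicting source (0) vs. target (1) domain
        from pooled backbone features."""
        def __init__(self, in_dim=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim, 128),
                nn.ReLU(),
                nn.Linear(128, 1),
            )

        def forward(self, feats, lam=1.0):
            return self.net(GradientReversal.apply(feats, lam))

    # Hypothetical usage with random stand-ins for detector backbone features;
    # lam (the adversarial loss weight) is a placeholder, not the paper's value.
    if __name__ == "__main__":
        disc = DomainDiscriminator(in_dim=256)
        bce = nn.BCEWithLogitsLoss()

        src_feats = torch.randn(8, 256, requires_grad=True)  # synthetic-domain features
        tgt_feats = torch.randn(8, 256, requires_grad=True)  # real-domain features

        dom_loss = bce(disc(src_feats, lam=0.1), torch.zeros(8, 1)) + \
                   bce(disc(tgt_feats, lam=0.1), torch.ones(8, 1))
        dom_loss.backward()  # gradients reaching the features are reversed by the GRL
        print(f"domain loss: {dom_loss.item():.4f}")

In a full detector this adversarial loss would be added to the supervised detection loss computed on the synthetic data; the image-level branch mentioned in the abstract would apply an analogous discriminator at the input-image level rather than on backbone features.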