Abstract

With the growing interest in space science and exploration, the number of spacecraft in Earth's orbit has risen steadily. To ensure the safety and operational integrity of active satellites, advanced surveillance and early warning of unknown space objects such as space debris are crucial. Traditional threshold-based filters for space object detection rely heavily on manual parameter settings, which leads to poor flexibility, high false alarm rates, and weak detection capability at low signal-to-noise ratios. Detecting faint, small objects against a complex starry background therefore remains a formidable challenge. To address it, we propose a novel, intelligent, and accurate detection method called You Only Look Once for Space Object Detection (SOD-YOLO). Our method introduces the following modules: Multi-Channel Histogram Truncation (MHT) enhances feature representation; CD-ELAN, based on Central Differential Convolution (CDC), facilitates learning contrast information; the Space-to-Depth (SPD) module replaces pooling layers to prevent the loss of small-object features; a simple, parameter-free attention module (SimAM) expands the receptive field to capture global contextual information; and Alpha-EIoU optimizes the loss function for efficient training. Experiments on our SSOD dataset show that SOD-YOLO can detect objects at a signal-to-noise ratio as low as 2.08, improves AP by 11.2% over YOLOv7, and increases detection speed by 42.7%. Evaluation on the Spot the Geosynchronous Orbit Satellites (SpotGEO) dataset demonstrates that SOD-YOLO performs comparably to state-of-the-art methods, confirming its generalization ability and precision.
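The SPD module mentioned above downsamples feature maps by rearranging spatial blocks into channels instead of pooling, so no activations from faint, small objects are discarded. A minimal NumPy sketch of that space-to-depth rearrangement is shown below; the function name, block size, and channel-last layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def space_to_depth(x, block=2):
    """Rearrange spatial blocks into channels:
    (H, W, C) -> (H // block, W // block, C * block**2).

    Unlike max/average pooling, every activation is kept, so fine
    detail from small, low-SNR objects survives the downsampling.
    """
    h, w, c = x.shape
    assert h % block == 0 and w % block == 0, "spatial dims must divide by block"
    x = x.reshape(h // block, block, w // block, block, c)
    x = x.transpose(0, 2, 1, 3, 4)  # gather each block's pixels together
    return x.reshape(h // block, w // block, c * block * block)

# Downsample an 8x8 single-channel map without losing any values.
feat = np.arange(64, dtype=np.float32).reshape(8, 8, 1)
out = space_to_depth(feat, block=2)
print(out.shape)       # (4, 4, 4)
print(out[0, 0])       # the four pixels of the top-left 2x2 block
```

In a detection backbone this rearrangement is typically followed by a stride-1 convolution, giving the same spatial reduction as a strided conv or pooling layer while remaining information-preserving.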
