Abstract

Reliable pose estimation of non-cooperative spacecraft is a key technology for on-orbit servicing and active debris removal missions. Deep learning on monocular camera images is an effective approach and a focus of current research. To reduce errors and improve model generalization, researchers often design multi-head loss functions or use generative models for complex data augmentation, which makes the task complex and time-consuming. We propose a pyramid vision transformer spatial-aware keypoints regression network and a stereo-aware augmentation strategy to achieve robust prediction. Specifically, we use the eight vertices of the cuboid satellite body as landmarks, and the observable surfaces can be transformed accordingly using the pose labels. Experimental results on the SPEED+ dataset show that, using the existing EPnP algorithm and pseudo-label self-training, we achieve high-precision cross-domain pose estimation of the target. Compared with existing methods, our model and strategy are more straightforward: the entire process requires no newly generated images, which significantly reduces storage requirements and time costs. Combined with a Kalman filter, robust and continuous output of the target position and attitude is verified on the SHIRT dataset. This work realizes deployment on mobile devices and provides strong technical support for applying automatic visual navigation systems in orbit.
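
To illustrate the pose-recovery step the abstract describes, the sketch below shows how 2D predictions for the eight cuboid vertices can be passed to OpenCV's EPnP solver to obtain rotation and translation. This is a minimal sketch, not the paper's implementation: the cuboid half-extents, camera intrinsics, and the `estimate_pose` helper are illustrative assumptions, not values or code from the paper.

```python
import numpy as np
import cv2

# Eight vertices of the cuboid satellite body in the body frame (metres).
# The half-extents below are hypothetical placeholders.
hx, hy, hz = 0.5, 0.5, 0.3
object_points = np.array(
    [[sx * hx, sy * hy, sz * hz]
     for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)],
    dtype=np.float64,
)

# Pinhole camera intrinsic matrix (assumed values, for illustration only).
K = np.array([[3000.0, 0.0, 960.0],
              [0.0, 3000.0, 600.0],
              [0.0, 0.0, 1.0]])


def estimate_pose(keypoints_2d: np.ndarray):
    """Recover pose from the eight regressed keypoints via EPnP.

    keypoints_2d: array of shape (8, 2) holding pixel coordinates,
    ordered consistently with object_points.
    """
    ok, rvec, tvec = cv2.solvePnP(
        object_points,
        keypoints_2d.astype(np.float64),
        K,
        distCoeffs=None,
        flags=cv2.SOLVEPNP_EPNP,
    )
    if not ok:
        raise RuntimeError("EPnP failed to find a pose")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec  # attitude and position of the target
```

In a tracking setting such as the SHIRT sequences, the per-frame `(R, tvec)` outputs would then feed a Kalman filter to produce the smooth, continuous pose trajectory mentioned above.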
