Abstract

This paper introduces an adaptive Convolutional Neural Network (CNN)-based Unscented Kalman Filter for the pose estimation of uncooperative spacecraft. The validation is carried out at Stanford's robotic Testbed for Rendezvous and Optical Navigation on the Satellite Hardware-In-the-loop Rendezvous Trajectories (SHIRT) dataset, which simulates vision-based rendezvous trajectories of a servicer spacecraft approaching PRISMA's Tango spacecraft. The proposed navigation system is stress-tested on synthetic as well as realistic lab imagery by simulating space-like illumination conditions on the ground. The validation is performed at different levels of the navigation system, first by training and testing the adopted CNN on SPEED+, Stanford's spacecraft pose estimation dataset, with specific emphasis on the domain shift between a synthetic domain and a Hardware-In-the-Loop domain. A novel data augmentation scheme based on light randomization is proposed to improve the CNN's robustness under adverse viewing conditions, reaching centimeter-level and 10-degree-level pose errors in 80% of the SPEED+ lab images. Next, the entire navigation system is tested on the SHIRT dataset. Results indicate that a new scheme that adaptively scales the heatmap-based measurement error covariance based on the filter innovations improves filter robustness, returning centimeter-level position errors and moderate attitude accuracies. This suggests that a proper representation of the measurement uncertainty, combined with an adaptive measurement error covariance, is key to improving navigation robustness.
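
Below is a minimal, hypothetical sketch of the adaptive measurement-update idea summarized above: the normalized innovation squared (NIS) is checked against a chi-square gate, and the measurement-noise covariance R is inflated whenever the gate is exceeded, so that degraded CNN keypoint measurements are down-weighted. A plain linear Kalman update is used for brevity (the paper's filter is an Unscented Kalman Filter), and the function name, the gate value, and the specific scaling rule are illustrative assumptions rather than the authors' implementation.

import numpy as np

def adaptive_update(x, P, z, H, R, chi2_gate=16.9):
    """One linear measurement update with innovation-based inflation of R.

    x : (n,)   prior state mean
    P : (n, n) prior state covariance
    z : (m,)   measurement (e.g. keypoint coordinates derived from CNN heatmaps)
    H : (m, n) measurement matrix
    R : (m, m) nominal measurement-noise covariance
    chi2_gate : illustrative chi-square threshold on the NIS
    """
    nu = z - H @ x                            # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    nis = float(nu @ np.linalg.solve(S, nu))  # normalized innovation squared

    if nis > chi2_gate:
        # Measurements appear less reliable than R claims (e.g. adverse
        # lighting degrading the CNN output): inflate R so the gated NIS
        # is pulled back toward the threshold.
        R = R * (nis / chi2_gate)
        S = H @ P @ H.T + R

    K = P @ H.T @ np.linalg.inv(S)            # gain using the (possibly) inflated R
    x_new = x + K @ nu
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new, R

# Toy usage: a 2-D state observed directly, with an outlier-like measurement.
x0, P0 = np.zeros(2), 0.5 * np.eye(2)
z, H, R0 = np.array([3.0, -2.5]), np.eye(2), 0.01 * np.eye(2)
x1, P1, R1 = adaptive_update(x0, P0, z, H, R0)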
