Abstract

In the framework of autonomous spacecraft navigation, this manuscript proposes a novel vision-based terrain relative navigation (TRN) system called FederNet. The developed system exploits patterns of observed craters to obtain absolute position measurements, which are then fed to a navigation filter that estimates the spacecraft state in terms of position and velocity. Recovering crater locations from elevation imagery is not an easy task, since sensors can produce images with vastly different appearances and qualities. Several problems therefore had to be addressed: first, detecting craters in elevation images; second, matching the detected craters against a catalogue of known craters; third, estimating the spacecraft position from the retrieved matches; and finally, integrating this estimate into a navigation filter. The detection problem was tackled with a robust deep learning approach. A crater matching algorithm based on geometric descriptors was then developed to solve the pattern recognition problem. Finally, the position estimation algorithm was integrated with an Extended Kalman Filter built on a Keplerian propagator. This deliberately simple choice highlights the performance achieved by the measurement pipeline itself, and the system could benefit further from more accurate propagators. The FederNet system was validated through an experimental analysis on real elevation images. Results showed that FederNet is capable of cruising with a navigation accuracy below 400 meters when a sufficient number of well-distributed craters is available for matching. FederNet's capabilities can be further improved with higher-resolution data and by fusing its output with other sensor measurements, such as a lunar GPS, which is currently under investigation by many researchers.
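
The abstract does not give implementation details, but the filter architecture it describes (an Extended Kalman Filter whose prediction step uses a Keplerian, i.e. two-body, propagator and whose update step ingests absolute position fixes from the crater-matching stage) can be illustrated with the minimal Python sketch below. All names, step sizes, noise values, and the position-only measurement model are illustrative assumptions, not FederNet's actual implementation.

```python
import numpy as np

MU_MOON = 4.9028e12  # lunar gravitational parameter [m^3/s^2]


def two_body(state):
    """Keplerian (two-body) dynamics: d/dt [r, v] = [v, -mu * r / |r|^3]."""
    r, v = state[:3], state[3:]
    a = -MU_MOON * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, a])


def rk4_step(state, dt):
    """One fixed-step RK4 integration of the two-body dynamics."""
    k1 = two_body(state)
    k2 = two_body(state + 0.5 * dt * k1)
    k3 = two_body(state + 0.5 * dt * k2)
    k4 = two_body(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)


def state_transition(state, dt):
    """First-order state-transition matrix Phi ~ I + F*dt of the two-body dynamics."""
    r = state[:3]
    rn = np.linalg.norm(r)
    # Gravity gradient: d a / d r = mu * (3 r r^T / |r|^5 - I / |r|^3)
    G = MU_MOON * (3.0 * np.outer(r, r) / rn ** 5 - np.eye(3) / rn ** 3)
    F = np.zeros((6, 6))
    F[:3, 3:] = np.eye(3)
    F[3:, :3] = G
    return np.eye(6) + F * dt


class PositionFixEKF:
    """EKF with a Keplerian prediction step and absolute-position updates."""

    def __init__(self, x0, P0, Q, R):
        self.x, self.P, self.Q, self.R = x0, P0, Q, R
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # measure position only

    def predict(self, dt):
        phi = state_transition(self.x, dt)
        self.x = rk4_step(self.x, dt)
        self.P = phi @ self.P @ phi.T + self.Q

    def update(self, z):
        """z: absolute position fix, e.g. from a crater-matching stage."""
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P


if __name__ == "__main__":
    # Illustrative circular 100 km lunar orbit as the initial condition.
    r0 = 1_737_400.0 + 100_000.0
    v0 = np.sqrt(MU_MOON / r0)
    x0 = np.array([r0, 0.0, 0.0, 0.0, v0, 0.0])
    ekf = PositionFixEKF(
        x0,
        P0=np.eye(6) * 1e6,
        Q=np.eye(6) * 1e-2,
        R=np.eye(3) * 200.0 ** 2,  # assumed 200 m (1-sigma) position-fix noise
    )
    ekf.predict(dt=10.0)
    ekf.update(z=ekf.x[:3] + np.random.normal(0.0, 200.0, 3))  # simulated crater fix
    print(ekf.x)
```

The sketch highlights the point made in the abstract: because the propagator is a simple two-body model, the achievable accuracy is driven mainly by the quality and distribution of the crater-based position fixes.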
