Abstract

This paper investigates the potential of optical line-of-sight navigation using star trackers. First, a synthetic image simulator is developed to generate realistic images, which is later used to assess the star tracker's performance. Generic considerations regarding attitude estimation are then drawn, highlighting how the camera's characteristics influence the estimation accuracy. The full attitude estimation chain is designed and analyzed to maximize performance in a deep-space cruising scenario. The focus then shifts to the planet-centroiding algorithm itself, with particular emphasis on the illumination compensation routine, which proves fundamental to achieving the required navigation accuracy. The influence of the position of the planet's center within a single pixel is also investigated, showing how this uncontrollable parameter can degrade performance. Finally, the complete algorithm chain is tested with the synthetic image simulator across a wide range of scenarios. The results are promising: with the selected hardware, even in the highest-noise condition, the line-of-sight azimuth and elevation errors are on the order of 1-2 arcsec for Venus and below 1 arcsec for Jupiter, for a spacecraft located at 1 AU from the Sun. These values allow for a positioning error below 1000 km, in line with the current non-autonomous navigation state of the art.
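To relate the quoted arcsec-level angular errors to the kilometre-level positioning error, the minimal Python sketch below applies the small-angle approximation: a line-of-sight angular error θ observed over a range d produces a cross-track position uncertainty of roughly θ·d. This is only an order-of-magnitude check, not the paper's method: the 1 AU range and the function name are illustrative assumptions, and the actual positioning error also depends on the observation geometry and on combining multiple line-of-sight measurements.

```python
import math

ARCSEC_TO_RAD = math.pi / (180.0 * 3600.0)  # 1 arcsec expressed in radians
AU_KM = 1.495978707e8                       # astronomical unit in kilometres

def cross_track_error_km(angle_err_arcsec: float, range_km: float) -> float:
    """Small-angle cross-track position error implied by a line-of-sight
    angular error at a given observer-to-target range."""
    return angle_err_arcsec * ARCSEC_TO_RAD * range_km

# Illustrative numbers only: 1-2 arcsec errors (as quoted for Venus) over ~1 AU.
for err in (1.0, 2.0):
    print(f"{err:.0f} arcsec at 1 AU -> {cross_track_error_km(err, AU_KM):.0f} km")
```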
