Abstract

With the advent of autonomous spacecraft formation flying missions, the ability of satellites to autonomously navigate relative to other space objects has become essential. Implementing spacecraft relative navigation requires acquiring relative measurements and processing them using relative state estimation. An efficient way to generate such measurements is vision: cameras are passive, low-energy, and information-rich sensors that do not actively interact with other space objects. However, pointing a camera with a conventional field of view at another space object requires substantial a priori initialization data; in particular, dedicated attitude maneuvers are needed, which may interfere with the satellite's main mission. One way to overcome these difficulties is to use an omnidirectional vision sensor, which has a 360-degree horizontal field of view. In this work, we present the development of an omnidirectional vision sensor for satellites that can be used for spacecraft relative navigation, formation flying, and space situational awareness. The study includes the development of the measurement equations, dynamical models, and state estimation algorithms, as well as a numerical study, an experimental investigation, and a space scalability analysis.
