Abstract

Autonomous aerial refueling requires a high level of accuracy and integrity. A great deal of research has been conducted in this problem area and has successfully demonstrated techniques with sufficient accuracy for fully autonomous refueling. Unfortunately, significantly less research has focused on the integrity of the relative navigation solution, especially in the presence of degraded sensor operation. In this paper, we present an image-based relative navigation algorithm that uses a predictive rendering technique to determine the relative pose of a lead aircraft from a monocular imaging camera. The predictive rendering algorithm is developed using image processing techniques, and its performance is evaluated from an observability perspective using flight test data. Three variations of the algorithm are analyzed: sum of squared differences, magnitude of gradient, and gradient with threshold. The observability of translation and rotation errors is discussed for all three techniques, noting any significant issues for each approach. Overall, the predictive rendering algorithm is shown to provide an accurate estimate of the relative position and attitude of a lead aircraft using a monocular camera. In addition, the algorithm can be tuned to provide a measurement that emphasizes precision versus convergence stability based on the mission requirements.
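The three image-comparison variants named above can be illustrated with a minimal sketch. The abstract does not give the exact cost formulations used in the paper, so the function names, the use of NumPy's central-difference gradient, and the edge threshold value below are all illustrative assumptions: each cost scores how well a predicted rendering of the lead aircraft matches the camera image.

```python
import numpy as np

def ssd_cost(camera, rendered):
    """Sum of squared per-pixel differences between the camera image
    and the predicted rendering (raw intensity comparison)."""
    diff = camera.astype(float) - rendered.astype(float)
    return float(np.sum(diff ** 2))

def gradient_magnitude(img):
    """Per-pixel gradient magnitude via central differences
    (an assumption; the paper's gradient operator is not specified)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def gradient_cost(camera, rendered):
    """SSD computed on gradient-magnitude images, which emphasizes
    edges over absolute brightness."""
    return ssd_cost(gradient_magnitude(camera), gradient_magnitude(rendered))

def thresholded_gradient_cost(camera, rendered, threshold=10.0):
    """Compare binarized edge maps: keep only gradients above a
    threshold (hypothetical value) and count mismatched edge pixels."""
    edges_cam = gradient_magnitude(camera) > threshold
    edges_ren = gradient_magnitude(rendered) > threshold
    return float(np.sum(edges_cam ^ edges_ren))
```

In a predictive-rendering loop, the relative pose estimate would be adjusted to minimize one of these costs; the choice among them is what trades precision against convergence stability, as the abstract notes.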
