Abstract

The objective of this paper is to develop a vision-based terminal guidance system for sensorless missiles. Specifically, monocular vision-based relative navigation and robust control methods are developed for a sensorless missile to intercept a ground target maneuvering with unknown time-varying velocity. A mobile wireless sensor and actor network is considered wherein a moving airborne monocular camera (e.g., attached to an aircraft) provides image measurements of the missile (the actor) while another moving monocular camera (e.g., attached to a small UAV) tracks the ground target. The challenge is to express the unknown time-varying target position in the time-varying missile frame using image feedback from cameras moving along unknown trajectories. In a novel relative navigation approach, assuming knowledge of a single geometric length on the missile, the time-varying target position is obtained by fusing the daisy-chained image measurements of the missile and the target into a homography-based Euclidean reconstruction method. The three-dimensional interception problem is posed in pursuit guidance, proportional navigation, and a proposed hybrid guidance framework. Interestingly, it is shown that by appropriately defining the error system, a single control structure can be maintained across all of the above guidance methods. The control problem is formulated in terms of the target dynamics in a 'virtual' camera mounted on the missile, which enables the design of an adaptive nonlinear visual servo controller that compensates for the unknown time-varying missile-target relative velocity. A stability and zero-miss-distance analysis of the proposed controller is presented, and a high-fidelity numerical simulation verifies the performance of the guidance laws.
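As context for the guidance frameworks named in the abstract, the sketch below shows a textbook planar proportional-navigation law, in which the commanded lateral acceleration is proportional to the line-of-sight rate scaled by the closing velocity. This is a generic illustration, not the paper's adaptive visual servo controller; the function name and the navigation gain N = 3 are illustrative assumptions.

```python
import math

def pn_lateral_accel(rel_pos, rel_vel, N=3.0):
    """Planar proportional navigation: a = N * Vc * lambda_dot.

    rel_pos, rel_vel: (x, y) target position and velocity relative
    to the missile, expressed in a fixed planar frame.
    Returns the commanded lateral acceleration.
    """
    rx, ry = rel_pos
    vx, vy = rel_vel
    r2 = rx * rx + ry * ry
    # Line-of-sight rate: z-component of (r x v) / |r|^2
    los_rate = (rx * vy - ry * vx) / r2
    # Closing velocity: -d|r|/dt = -(r . v) / |r|
    v_close = -(rx * vx + ry * vy) / math.sqrt(r2)
    return N * v_close * los_rate
```

On a pure collision course the line-of-sight rate is zero and the law commands no acceleration; any rotation of the line of sight produces a corrective command that drives the geometry back toward collision, which is the zero-miss-distance property analyzed in the paper's framework.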
