Abstract

This article describes an algorithm for reconstructing the motion of a radar platform relative to a scene, given video phase history and an estimate of the sensor’s average speed and altitude. This algorithm is a more robust alternative to previous geometric-invariant-based approaches, capable of producing motion estimates accurate enough for the formation of backprojection imagery. The algorithm is modular. In the first phase, the algorithm estimates the range to fixed locations in the scene as a function of time. In the second phase, these range tracks are used by a specialized solver to recover the relative positions of the tracked locations in the scene, along with the relative position of the sensor platform as a function of time. We demonstrate the effectiveness of the algorithm by forming an image from the large-scene Gotcha data set, without the use of GPS or inertial measurement unit data. The results are compared to those from another invariant-based algorithm.
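As a rough illustration of the second phase only, the sketch below (not the paper's solver) shows how range tracks to fixed scene points could be fit jointly for the platform trajectory and point positions by nonlinear least squares. The function names and the use of SciPy are assumptions; without additional constraints such as the estimated average speed and altitude, this kind of recovery is only defined up to a rigid transform.

```python
# Hypothetical sketch, not the paper's method: jointly estimate platform
# positions x_t and fixed scene-point positions p_j from range tracks
# r[t, j] ~= ||x_t - p_j|| by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def recover_geometry(range_tracks, x0, p0):
    """range_tracks: (T, J) measured ranges; x0: (T, 3) initial guess for
    platform positions; p0: (J, 3) initial guess for scene-point positions."""
    T, J = range_tracks.shape

    def residuals(params):
        x = params[:T * 3].reshape(T, 3)   # platform trajectory samples
        p = params[T * 3:].reshape(J, 3)   # tracked scene points
        # Predicted range from every platform position to every scene point.
        pred = np.linalg.norm(x[:, None, :] - p[None, :, :], axis=2)
        return (pred - range_tracks).ravel()

    sol = least_squares(residuals, np.concatenate([x0.ravel(), p0.ravel()]))
    x_hat = sol.x[:T * 3].reshape(T, 3)
    p_hat = sol.x[T * 3:].reshape(J, 3)
    return x_hat, p_hat
```

In practice the solution would still need to be anchored (for example with the sensor's average speed and altitude, as the abstract indicates) to resolve the remaining gauge freedom before image formation.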
