Abstract

In 2009, the Lunar Reconnaissance Orbiter (LRO) launched with the Lunar Orbiter Laser Altimeter (LOLA), an instrument that precisely measures the Moon's surface elevation. LOLA data are recorded along near-polar orbital tracks, making them particularly sparse in the equatorial regions, where NASA has acquired large sets of orbital images over the last five decades. Coregistering these orbital images (often taken from an imprecisely known spacecraft camera pose) with the precise but sparse LOLA measurements is crucial for building large-scale, accurate lunar maps that support current and near-term missions to the Moon by NASA and international space agencies. In this paper we introduce a novel algorithm for matching orbital images captured during the Apollo 15, 16, and 17 missions with LIDAR data captured by the LOLA instrument. The surface normals extracted from each LOLA shot, the Apollo Metric camera pose, the Sun position at image capture time, and the lunar albedo are used to generate a synthetic orbital image that serves as a reference. We then use the Gauss-Newton algorithm to precisely align the actual (Apollo) orbital image to this reference image at progressively higher-resolution levels of an image pyramid.
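
The abstract does not give implementation details, so the sketch below is only illustrative of the two steps it describes: rendering a synthetic reference image from LOLA-derived surface normals, the Sun direction, and an albedo estimate, then aligning the real orbital image to it with Gauss-Newton iterations run coarse-to-fine over an image pyramid. It assumes a simple Lambertian reflectance model and a pure 2D translation as the alignment warp (the paper's method presumably refines a richer camera model); all function and parameter names here are invented for this example.

```python
import numpy as np
from scipy.ndimage import map_coordinates, zoom


def render_synthetic_image(normals, sun_dir, albedo):
    """Lambertian shading: I = albedo * max(0, n . s).

    normals : (H, W, 3) unit surface normals interpolated from LOLA shots
    sun_dir : (3,) unit vector toward the Sun at image capture time
    albedo  : scalar or (H, W) lunar albedo estimate
    """
    shading = np.clip(np.tensordot(normals, sun_dir, axes=([2], [0])), 0.0, None)
    return albedo * shading


def gauss_newton_translation(image, reference, p0=(0.0, 0.0), n_iter=25, tol=1e-4):
    """Gauss-Newton estimate of a shift p = (dy, dx) minimizing
    sum_x (image(x + p) - reference(x))^2."""
    p = np.array(p0, dtype=float)
    gy, gx = np.gradient(reference)           # reference gradients used as a fixed Jacobian
    J = np.stack([gy.ravel(), gx.ravel()], axis=1)
    H = J.T @ J                               # 2x2 Gauss-Newton normal matrix
    ys, xs = np.mgrid[0:reference.shape[0], 0:reference.shape[1]]
    for _ in range(n_iter):
        warped = map_coordinates(image, [ys + p[0], xs + p[1]], order=1, mode="nearest")
        r = (warped - reference).ravel()      # photometric residual
        dp = np.linalg.solve(H, -J.T @ r)     # Gauss-Newton update step
        p += dp
        if np.linalg.norm(dp) < tol:
            break
    return p


def align_coarse_to_fine(image, reference, levels=4):
    """Run the alignment from the coarsest pyramid level to the finest,
    doubling the shift estimate when moving up one level."""
    p = np.zeros(2)
    for level in reversed(range(levels)):
        scale = 0.5 ** level
        img_l = zoom(image, scale, order=1)
        ref_l = zoom(reference, scale, order=1)
        p = gauss_newton_translation(img_l, ref_l, p0=p)
        if level > 0:
            p *= 2.0                          # carry the estimate to the next finer level
    return p                                  # shift in full-resolution pixels
```

A more faithful variant would swap in a lunar photometric model (e.g., Lunar-Lambert) for the shading step and parameterize the warp by the camera pose rather than a translation; the Gauss-Newton structure stays the same, only the Jacobian changes.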
