Abstract
In 2009, the Lunar Reconnaissance Orbiter (LRO) launched with the Lunar Orbiter Laser Altimeter (LOLA), an instrument that precisely measures the Moon's surface elevation. LOLA data are recorded along near-polar orbital tracks, making them particularly sparse in the equatorial regions where NASA has large sets of orbital images captured over the last five decades. The coregistration of orbital images (often taken from an imprecise spacecraft camera pose) with the precise but sparse LOLA measurements is crucial for building large-scale, accurate lunar maps that support current and near-term NASA and international space agencies' missions to the Moon. In this paper we introduce a novel algorithm for matching orbital images captured during the Apollo 15, 16, and 17 missions with LIDAR data captured by the LOLA instrument. The surface normals extracted from each LOLA shot, the Apollo Metric camera pose, the Sun position at image capture time, and the lunar albedo are used to render a synthetic orbital image that serves as a reference. We then use the Gauss-Newton algorithm to precisely align the actual (Apollo) orbital image to this reference image at progressively higher-resolution levels of the image pyramid.
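The pipeline the abstract describes, rendering a synthetic reference image from LOLA-derived surface normals, the Sun direction, and albedo, then registering the Apollo image to it with Gauss-Newton over an image pyramid, can be sketched roughly as below. This is a minimal illustration under simplifying assumptions, not the authors' implementation: it assumes a plain Lambertian reflectance model (the paper's actual photometric model may differ) and a pure-translation warp rather than a full camera-pose refinement, and the function names (`lambertian_shade`, `gauss_newton_shift`, `align_pyramid`) are hypothetical.

```python
import numpy as np
from scipy import ndimage


def lambertian_shade(normals, sun_dir, albedo=1.0):
    """Render a synthetic reference image from LOLA-derived surface normals.

    normals : (H, W, 3) array of unit surface normals
    sun_dir : (3,) unit vector toward the Sun at image capture time
    albedo  : scalar or (H, W) lunar albedo estimate
    """
    cos_incidence = np.clip(normals @ sun_dir, 0.0, None)  # back-facing pixels -> 0
    return albedo * cos_incidence


def gauss_newton_shift(ref, img, p0, n_iters=20, tol=1e-3):
    """Gauss-Newton estimate of the (row, col) shift aligning `img` to `ref`."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iters):
        warped = ndimage.shift(img, shift=p, order=1, mode="nearest")
        gy, gx = np.gradient(warped)
        # Residual r(p) = warped - ref; its Jacobian w.r.t. the shift is -[gy, gx].
        J = -np.stack([gy.ravel(), gx.ravel()], axis=1)
        r = (warped - ref).ravel()
        dp, *_ = np.linalg.lstsq(J, -r, rcond=None)  # Gauss-Newton step
        p += dp
        if np.linalg.norm(dp) < tol:
            break
    return p


def align_pyramid(ref, img, n_levels=3):
    """Coarse-to-fine alignment over a decimation pyramid (coarsest level first)."""
    p = np.zeros(2)  # shift expressed in full-resolution pixels
    for lvl in reversed(range(n_levels)):
        s = 2 ** lvl
        p = s * gauss_newton_shift(ref[::s, ::s], img[::s, ::s], p0=p / s)
    return p
```

The coarse-to-fine loop mirrors the abstract's progression through the image pyramid: the shift found at a coarse level seeds the Gauss-Newton refinement at the next finer level, which helps the local linearization converge even when the initial camera pose is imprecise.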