Abstract

Physically based radiative transfer models (RTMs) are widely used in Earth observation to understand the radiation processes occurring on the Earth's surface and their interactions with water, vegetation, and the atmosphere. Through continuous improvements, RTMs have gained accuracy and representativeness of complex scenes, but at the expense of greater complexity and computation time, making them impractical in various remote sensing applications. To overcome this limitation, the common practice is to precompute large lookup tables (LUTs) that are later interpolated. To further reduce the RTM computation burden and the LUT interpolation error, we have developed a method to automatically select the minimum and optimal set of input-output points (nodes) to be included in an LUT. We present the gradient-based automatic LUT generator algorithm (GALGA), which relies on the notion of an acquisition function that incorporates: 1) the Jacobian evaluation of an RTM and 2) information about the multivariate distribution of the current nodes. We illustrate the capabilities of GALGA in the automatic construction and optimization of MODTRAN-based LUTs for input variable spaces of different dimensions. Our results indicate that, compared with a pseudorandom homogeneous distribution of the LUT nodes, GALGA reduces: 1) the LUT size by >24%; 2) the computation time by 27%; and 3) the maximum interpolation relative errors by at least 10%. It is concluded that automatic LUT design might benefit from the methodology proposed in GALGA to reduce interpolation errors and computation time in computationally expensive RTMs.
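The acquisition-function idea described above can be illustrated with a minimal one-dimensional sketch. This is not the published GALGA implementation; the model function, the finite-difference Jacobian estimate, and the product of gradient magnitude and nearest-node distance are all simplifying assumptions chosen to show how new LUT nodes can be drawn toward regions where the model changes rapidly and existing nodes are sparse.

```python
import math

def rtm(x):
    # Hypothetical cheap stand-in for an expensive RTM output
    # along a single input variable.
    return math.exp(-x) * math.sin(4.0 * x)

def acquisition(x, nodes, eps=1e-3):
    # 1) local gradient magnitude (central finite-difference
    #    estimate of the Jacobian in this 1-D example)
    grad = abs(rtm(x + eps) - rtm(x - eps)) / (2.0 * eps)
    # 2) distance to the nearest existing node, so that new nodes
    #    also spread out over the input domain
    dist = min(abs(x - n) for n in nodes)
    return grad * dist

def grow_lut(n_nodes, lo=0.0, hi=2.0, n_cand=200):
    # Start from the domain endpoints and greedily add the
    # candidate point that maximizes the acquisition function.
    nodes = [lo, hi]
    cands = [lo + (hi - lo) * i / (n_cand - 1) for i in range(n_cand)]
    while len(nodes) < n_nodes:
        best = max(cands, key=lambda x: acquisition(x, nodes))
        nodes.append(best)
    return sorted(nodes)

lut_x = grow_lut(8)
lut = {x: rtm(x) for x in lut_x}  # node -> precomputed model output
```

In practice the sequential selection means each new node accounts for all previously chosen ones: already-sampled regions score zero on the distance term and are never re-picked, while high-gradient gaps are filled first.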
