Abstract

Rugged terrain, which covers a large fraction of the Earth's terrestrial surface, is frequently reported to cause directionality in land surface thermal radiation (LSTR) and seriously degrades the retrieval accuracy of land surface temperature (LST) and surface longwave radiation from satellite measurements. Therefore, modeling topographic effects on surface thermal anisotropy is essential for understanding surface radiative processes. In this study, pixel-scale directional brightness temperature (DBT) and equivalent brightness temperature (EBT) models are proposed to characterize thermal anisotropy, accounting for viewing geometry, topographic effects, and subpixel variations on the basis of the thermal infrared radiative transfer equation. A simulated data set of DBT and EBT at 1-km resolution was generated from 30-m LST, emissivity, and terrain data. Terrain, coupled with solar and viewing geometries and subgrid variation, significantly affects the directionality of LSTR and produces a notable bias between DBT and EBT. For nadir observations, the bias ranges from −0.8 to 1 K, and it reaches −5 to 2 K at a viewing zenith angle of 50°. The maximum deviation is about 9 K over the most rugged mountains, corresponding to a longwave radiation bias of 57.6 W/m² for a 300 K blackbody. Furthermore, when LST is retrieved from DBT, a broadband emissivity uncertainty of 0.01 causes an LST bias of ∼0.35 K. The proposed models are helpful for exploring terrain-induced thermal anisotropy and offer guidance for reducing estimation bias in remote sensing products over complex terrain.
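
To make the quoted numbers concrete, the sketch below is a minimal Python illustration, not the paper's actual formulation: the function names and the broadband radiance-averaging aggregation are assumptions, whereas the paper works with the thermal infrared radiative transfer equation and explicit viewing geometry. It shows how a 9 K brightness temperature deviation around 300 K translates to roughly 57.6 W/m² of longwave flux via the Stefan–Boltzmann law, and how a 1-km equivalent temperature might be aggregated from 30-m subpixel LSTs.

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def longwave_bias(t_ref_k: float, delta_t_k: float) -> float:
    """Blackbody longwave flux difference between t_ref and t_ref + delta_t."""
    return SIGMA * ((t_ref_k + delta_t_k) ** 4 - t_ref_k ** 4)

def aggregate_equivalent_temperature(lst_30m_k: np.ndarray,
                                     emissivity_30m: np.ndarray) -> float:
    """Hypothetical pixel-scale equivalent temperature: average the emitted
    broadband fluxes of the 30-m subpixels, then invert Stefan-Boltzmann.
    This is a simplified stand-in for the paper's spectral, geometry-aware
    EBT/DBT models."""
    flux = emissivity_30m * SIGMA * lst_30m_k ** 4   # emitted flux per subpixel
    mean_eps = emissivity_30m.mean()
    return (flux.mean() / (mean_eps * SIGMA)) ** 0.25

# A 9 K deviation at 300 K gives ~57.6 W/m^2, matching the value in the abstract.
print(f"{longwave_bias(300.0, 9.0):.1f} W/m^2")  # -> 57.6 W/m^2
```

In such a scheme, a directional quantity like DBT would additionally weight each subpixel by its visibility and orientation relative to the sensor; that topographic weighting is the presumed source of the view-angle-dependent bias summarized above.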
