Abstract

Ground-based measurements of land-surface temperature (LST) performed at a homogeneous rice-crop site close to Valencia, Spain, were used for the validation of the calibration and the atmospheric correction of the Landsat-7 Enhanced Thematic Mapper Plus (ETM+) thermal band. Atmospheric radiosondes were launched at the test site around the satellite overpasses. Field-emissivity measurements of the nearly fully vegetated rice crops were also performed. Seven concurrences of Landsat-7 overpasses and ground data were obtained in July and August of 2004-2007. The ground measurements were used with the MODTRAN-4 radiative transfer model to simulate at-sensor radiances and brightness temperatures, which were compared with the calibrated ETM+ observations over the test site. For the cases analyzed here, the differences between the simulated and ETM+ brightness temperatures show an average bias of 0.6 K and a root-mean-square difference (rmsd) of ±0.8 K. The ground-based measurements were also used for the validation of LSTs derived from ETM+ at-sensor radiances with atmospheric correction calculated from the following: 1) the local-radiosonde profiles and 2) the operational atmospheric-correction tool available at http://atmcorr.gsfc.nasa.gov. For the first case, the differences between the ground and satellite LSTs ranged from -0.6 to 1.4 K, with a mean bias of 0.7 K and an rmsd = ±1.0 K. For the second case, the differences ranged between -1.8 and 1.3 K, with a zero average bias and an rmsd = ±1.1 K. Although the validation cases are few and limited to a single land cover, morning overpasses, and summer conditions, the results show that good LST accuracy can be achieved with ETM+ thermal data.
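As a rough illustration of the single-channel atmospheric correction summarized above, the sketch below inverts the thermal radiative-transfer equation for ETM+ band 6 and computes bias and rmsd statistics. It is only a minimal sketch: the transmittance and path radiances (tau, L_up, L_down), the emissivity, and the radiance values are illustrative assumptions rather than data from the study; K1 and K2 are the published ETM+ band-6 calibration constants.

```python
import numpy as np

# Published ETM+ band-6 calibration constants for the band-effective Planck function.
K1 = 666.09   # W m-2 sr-1 um-1
K2 = 1282.71  # K

def brightness_temperature(radiance):
    """Band-effective inverse Planck function for ETM+ band 6."""
    return K2 / np.log(K1 / radiance + 1.0)

def lst_from_at_sensor_radiance(L_sensor, tau, L_up, L_down, emissivity):
    """Single-channel atmospheric and emissivity correction:
    L_sensor = tau * [eps * B(Ts) + (1 - eps) * L_down] + L_up, solved for Ts."""
    L_ground = (L_sensor - L_up) / tau                                  # remove path radiance and transmittance
    L_emitted = (L_ground - (1.0 - emissivity) * L_down) / emissivity   # remove reflected sky radiance
    return brightness_temperature(L_emitted)

# Illustrative (assumed) values for a single overpass
tau, L_up, L_down = 0.85, 1.20, 2.00   # atmospheric parameters, e.g. from MODTRAN-4 or the web tool
eps = 0.985                            # emissivity of full-cover rice (assumed here)
L_sensor = 9.50                        # calibrated ETM+ band-6 at-sensor radiance, W m-2 sr-1 um-1
print(lst_from_at_sensor_radiance(L_sensor, tau, L_up, L_down, eps))   # roughly 303.6 K

# Bias and root-mean-square difference between satellite and ground LSTs (made-up arrays)
lst_satellite = np.array([302.1, 304.5, 303.0])
lst_ground = np.array([301.5, 303.9, 303.4])
d = lst_satellite - lst_ground
bias, rmsd = d.mean(), np.sqrt(np.mean(d**2))
```

The same Planck constants can be used in the forward direction to convert simulated at-sensor radiances into the brightness temperatures that are compared with the calibrated ETM+ observations.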
