Abstract

Advanced Very High Resolution Radiometer (AVHRR) sensors provide a valuable data source for generating long-term global land surface temperature (LST) records. However, changes in observation time caused by satellite orbit drift restrict their wide application. Here, a generalized split-window (GSW) algorithm was implemented to retrieve LST from time-series AVHRR data. A novel orbit drift correction (ODC) algorithm, based on the diurnal temperature cycle (DTC) model and a Bayesian optimization algorithm, was then proposed to normalize the estimated LSTs to the same local time. The ODC algorithm is pixel-based and requires only one observation per day. The resulting LSTs from six years of National Oceanic and Atmospheric Administration (NOAA)-14 satellite data were validated using Surface Radiation Budget Network (SURFRAD) in-situ measurements. The average biases of the LST retrieval varied from −0.4 K to 2.0 K over the six stations and also depended on the viewing zenith angle and season. Simulated data illustrate that the proposed ODC method can improve the LST estimate by a magnitude similar to the accuracy of the LST retrieval itself, i.e., the root-mean-square errors (RMSEs) of the corrected LSTs were 1.3 K, 2.2 K, and 3.1 K for retrieval RMSEs of 1 K, 2 K, and 3 K, respectively. The method was insensitive to the fractional vegetation cover (FVC), including the FVC retrieval error and the size and degree of change of the neighboring area, which suggests that it could easily be updated with other LST expression models. In addition, ground validation showed an encouraging correction effect: the RMSE variations of the LST estimates introduced by the ODC were within ±0.5 K, and the correlation coefficients between the corrected and original LST errors reached 0.91.
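To illustrate the DTC-based normalization idea, the sketch below fits a common piecewise diurnal temperature cycle form (a daytime cosine joined to a nighttime exponential decay, in the spirit of Göttsche–Olesen-type models) to a day's LST observations and evaluates the fitted curve at a target local time. This is a simplified single-day least-squares illustration, not the paper's pixel-based Bayesian scheme that works with one observation per day across many days; all parameter names, starting values, and the half-period constant are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def dtc_model(t, T0, Ta, tm, ts, k):
    """Piecewise diurnal temperature cycle (t in hours, local solar time):
    a cosine around the daily maximum before ts, then exponential decay
    toward the residual temperature T0 after ts."""
    omega = 12.0  # half-period of the daytime cosine (h); an assumption
    day = T0 + Ta * np.cos(np.pi / omega * (t - tm))
    Ts = T0 + Ta * np.cos(np.pi / omega * (ts - tm))  # value at the join
    night = T0 + (Ts - T0) * np.exp(-(t - ts) / k)
    return np.where(t < ts, day, night)

def normalize_to_time(t_obs, lst_obs, t_target=14.5):
    """Fit the DTC model to sparse (time, LST) pairs and evaluate it at
    t_target, e.g. 14.5 h for a nominal 14:30 overpass."""
    # Rough initial guesses (assumptions, not values from the paper)
    p0 = [lst_obs.min(), np.ptp(lst_obs), 13.0, 17.0, 4.0]
    popt, _ = curve_fit(dtc_model, t_obs, lst_obs, p0=p0, maxfev=10000)
    return dtc_model(np.array([t_target]), *popt)[0]
```

In this toy form the fit needs at least five observations per day to constrain the five parameters; the paper's contribution is precisely to relax this by sharing information across days with Bayesian optimization.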

Highlights

  • After orbit drift correction (ODC), the root-mean-square error (RMSE) between the actual LSTs at 14:30 and the LSTs corrected to 14:30 from other observation times decreased to 2.5 K

  • In terms of LST retrieval, the RMSEs ranged from 2.2 K to 4.1 K and the overall biases ranged from −0.4 K to 2.0 K over the six sites, suggesting accuracy competitive with current LST products

Introduction

Land surface temperature (LST), an important parameter of the energy balance at regional and global scales, is defined as the radiometric temperature of the land surface [1]. Because it is sensitive to the surrounding environment, including surface components, soil physicochemical characteristics, and albedo, this variable exhibits clear spatial and temporal heterogeneity [3]. For this reason, satellite remote sensing provides the only practical way to obtain LST with both high resolution and complete spatial coverage. Based on different assumptions and approximations of the radiative transfer equation in the thermal infrared domain, many LST retrieval methods have been developed. Representative datasets include the Geostationary Operational Environmental Satellites (GOES) LST and the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) LST products, which provide hourly LSTs at a 2–10 km spatial resolution over America since 2010 [21] and 15 min LSTs at a 3 km resolution over Europe since 2006 [22], respectively.
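The split-window family of retrieval methods, to which the GSW algorithm used here belongs, can be sketched in its widely used form following Wan and Dozier's generalized split-window formulation. The coefficients are deliberately left as inputs: in practice they are tabulated per viewing zenith angle and atmospheric water vapour class, and none of the values shown below are taken from the paper.

```python
def gsw_lst(t4, t5, eps4, eps5, coeffs):
    """Generalized split-window LST (form after Wan & Dozier, 1996).

    t4, t5   : brightness temperatures (K) in the two thermal channels
               (AVHRR channels 4 and 5).
    eps4/5   : surface emissivities in the same channels.
    coeffs   : (C, A1, A2, A3, B1, B2, B3); in a real retrieval these are
               tabulated by angle/water-vapour class (placeholders here).
    """
    C, A1, A2, A3, B1, B2, B3 = coeffs
    eps = 0.5 * (eps4 + eps5)   # mean channel emissivity
    deps = eps4 - eps5          # channel emissivity difference
    avg = 0.5 * (t4 + t5)       # mean brightness temperature
    diff = t4 - t5              # brightness temperature difference
    return (C
            + (A1 + A2 * (1.0 - eps) / eps + A3 * deps / eps**2) * avg
            + (B1 + B2 * (1.0 - eps) / eps + B3 * deps / eps**2) * diff)
```

The mean term carries most of the temperature signal, while the difference term corrects for differential atmospheric absorption between the two channels.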
