Abstract

Satellite Image Time Series (SITS) is a data set of satellite images acquired over several years at a high acquisition rate. Radiometric normalization is a fundamental and important preprocessing step for remote sensing applications that use SITS, because noise between images introduces radiometric distortion. Traditional radiometric normalization methods normalize a subject image against a reference image, a strategy generally applied to multi-temporal imagery of only two or three scenes acquired at different times. However, these methods are unsuitable for calibrating SITS because they cannot minimize the radiometric distortion between every pair of images in the series. Existing relative radiometric normalization methods for SITS rely on linear assumptions and therefore cannot effectively reduce the nonlinear radiometric distortion caused by continuously changing noise in SITS. To overcome this problem and obtain a more accurate SITS, we propose a nonlinear radiometric normalization model (NMAG) for SITS based on Artificial Neural Networks (ANN) and a Greedy Algorithm (GA). In this method, the GA determines the correction order of the SITS and calculates the error between the image to be corrected and the already normalized images, which avoids selecting a single reference image. The ANN obtains the optimal solution of the error function, which minimizes the radiometric distortion between different images in the SITS. A SITS composed of 21 Landsat-8 images of Tianjin, China, acquired from October 2017 to January 2019, was selected to test the method. We compared NMAG with two contrast methods (Contrast Method 1 (CM1) and Contrast Method 2 (CM2)) and found that the average root mean square error (μRMSE) of NMAG (497.22) is significantly smaller than those of CM1 (641.39) and CM2 (543.47); the accuracy of the normalized SITS obtained using NMAG increases by 22.4% and 8.5% compared with CM1 and CM2, respectively. These experimental results confirm the effectiveness of NMAG in reducing the radiometric distortion caused by continuously changing noise between images in SITS.
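To make the described workflow concrete, the sketch below illustrates one way the two steps in the abstract (a greedy choice of correction order and an ANN that fits a nonlinear mapping to minimize an error function) could be combined. It is not the authors' NMAG implementation: the helper names (normalize_sits, rmse), the seed-image choice, the greedy criterion (RMSE against the pixel-wise mean of the already normalized images), and the use of scikit-learn's MLPRegressor as the ANN are all illustrative assumptions.

```python
# Minimal sketch of a greedy-order + ANN radiometric normalization, assuming the SITS
# is a list of co-registered single-band images (2-D numpy arrays). The error function
# and seed choice are assumptions, not the paper's exact NMAG formulation.
import numpy as np
from sklearn.neural_network import MLPRegressor

def rmse(a, b):
    """Root mean square error between two images of identical shape."""
    return float(np.sqrt(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)))

def normalize_sits(images, seed_index=0, hidden=(16, 16)):
    """Return the list of images normalized in a greedily chosen order."""
    normalized = {seed_index: images[seed_index].astype(np.float64)}
    remaining = [i for i in range(len(images)) if i != seed_index]

    while remaining:
        # Greedy step: pick the remaining image with the lowest RMSE against the
        # pixel-wise mean of the images normalized so far (instead of one fixed reference).
        target = np.mean(list(normalized.values()), axis=0)
        next_i = min(remaining, key=lambda i: rmse(images[i], target))
        remaining.remove(next_i)

        # ANN step: learn a nonlinear per-pixel-value mapping from the raw image to the
        # target, as a stand-in for minimizing the error function.
        x = images[next_i].reshape(-1, 1).astype(np.float64)
        y = target.reshape(-1)
        ann = MLPRegressor(hidden_layer_sizes=hidden, max_iter=500, random_state=0)
        ann.fit(x, y)
        normalized[next_i] = ann.predict(x).reshape(images[next_i].shape)

    return [normalized[i] for i in range(len(images))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.uniform(0, 1000, size=(64, 64))
    # Synthetic "SITS": the same scene under different nonlinear radiometric distortions.
    sits = [base, 0.8 * base ** 1.05 + 50.0, 30.0 * np.sqrt(base) + 10.0]
    result = normalize_sits(sits)
    print([round(rmse(r, result[0]), 2) for r in result])
```

On synthetic data like this, the residual RMSE between the normalized images should be much smaller than between the raw distorted inputs; the real method additionally has to cope with cloud, phenology, and other scene changes that this toy example ignores.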

Highlights

  • Satellite Image Time Series (SITS) is a data set that includes satellite images across several years with a high acquisition rate

  • We propose a nonlinear radiometric normalization model (NMAG) for SITS based on the Greedy Algorithm (GA) and Artificial Neural Networks (ANN)


Introduction

Satellite Image Time Series (SITS) is a data set of satellite images acquired over several years at a high acquisition rate. SITS can provide abundant information describing temporal changes in the generation and development of ground features in an area [1]. It has been used as an important data source in many fields, such as environmental monitoring, land cover change monitoring, and crop growth monitoring [2,3,4,5]. However, the temporal information extracted from SITS is inevitably disturbed by noise unrelated to ground features, such as atmospheric absorption and scattering, sensor-target illumination geometry, and sensor calibration, which leads to inaccurate results in remote sensing applications [6].
