Abstract
The normalized difference vegetation index (NDVI) is crucial to many sustainable agricultural practices such as vegetation monitoring and health evaluation. However, optical remote sensing data often suffer from large amounts of missing information due to sensor failures and harsh atmospheric conditions. Synthetic aperture radar (SAR) offers a new way to fill in missing optical data, owing to its high revisit frequency and its ability to image without interference from clouds and rain. However, the differing imaging mechanisms of SAR and optical sensors make fusing the two data sources difficult. This paper develops an advanced deep learning spatio-temporal fusion method, the Transformer Temporal-Spatial Model (TTSM), to synergize SAR and optical time series and reconstruct vegetation NDVI time series in cloudy regions. The proposed multi-head attention, end-to-end architecture achieved satisfactory accuracy (R2 greater than 0.88), outperforming existing deep learning solutions. Extensive experiments were carried out to evaluate the TTSM method over large-scale areas (at the spatial scale of megapixels) in northeast China, where the main vegetation types are crops and forests. The R2, SSIM, RMSE, NRMSE, and MAE of the prediction results were 0.88, 0.80, 0.06, 0.16, and 0.05, respectively. The influence of training sample size was investigated through a transfer learning study, and the results indicated that the model generalizes well. Overall, the proposed method can fill gaps in optical data over extensive vegetated regions using SAR.
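To make the fusion idea concrete, the following is a minimal sketch (not the authors' TTSM implementation) of how multi-head attention can combine a gap-free SAR time series with a partially observed NDVI time series to regress the missing NDVI values. The tensor shapes, layer sizes, input channels (e.g., VV/VH backscatter), and query/key assignment are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of attention-based SAR/optical temporal fusion (PyTorch).
# All architectural choices here are assumptions for illustration only.
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    def __init__(self, sar_channels=2, d_model=64, n_heads=4):
        super().__init__()
        self.sar_proj = nn.Linear(sar_channels, d_model)   # embed SAR (e.g., VV/VH) per time step
        self.ndvi_proj = nn.Linear(1, d_model)              # embed observed NDVI per time step
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)                   # regress NDVI at each time step

    def forward(self, sar_seq, ndvi_seq):
        # sar_seq: (batch, T, sar_channels); ndvi_seq: (batch, T, 1) with cloudy steps masked to zero
        q = self.sar_proj(sar_seq)        # queries from the gap-free SAR series
        kv = self.ndvi_proj(ndvi_seq)     # keys/values from the sparse optical NDVI
        fused, _ = self.attn(q, kv, kv)   # attend across time to borrow clear-sky information
        return self.head(fused)           # reconstructed NDVI time series, (batch, T, 1)

# Example usage: 8 pixels, 12 time steps
model = AttentionFusion()
ndvi_hat = model(torch.randn(8, 12, 2), torch.randn(8, 12, 1))
print(ndvi_hat.shape)  # torch.Size([8, 12, 1])
```

In such a design, the SAR sequence supplies the temporal queries precisely because it is unaffected by clouds, while the attention weights determine which clear-sky optical observations are most informative for each reconstructed time step.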