Abstract
The normalized difference vegetation index (NDVI) is a key parameter in precision agriculture. It has been used globally since the 1970s as a proxy to monitor crop growth and correlates with the crop coefficient (Kc), leaf area index (LAI), crop cover, and more. Yet, it is susceptible to clouds and other atmospheric conditions that can distort the measured NDVI value. Synthetic Aperture Radar (SAR), on the other hand, can penetrate clouds and is hardly affected by atmospheric conditions, but it is sensitive to the physical structure of the crop and therefore does not provide a direct indication of NDVI. Several SAR indices and methods have been suggested to estimate NDVI via SAR; however, they tend to work only under local spatial and temporal conditions and do not generalize globally, because they are not flexible enough to capture the changing NDVI–SAR relationship throughout the crop-growing season. This study suggests a new method for converting Sentinel-1 to NDVIs for Agricultural Fields (SNAF) by utilizing a hyperlocal machine learning approach. This method generates multiple disposable, on-the-fly, field- and time-specific models for every available Sentinel-1 image across 2021. Each model learns the field-specific NDVI (from Sentinel-2 and Landsat-8)–SAR (Sentinel-1) relationship from recent NDVI and SAR time series and consequently estimates the optimal NDVI value from the current SAR image. SNAF was tested on 548 commercial fields from 18 countries with 28 crop types and, based on 6880 paired NDVI–SAR images, achieved an RMSE, bias, and R² of 0.06, 0.00, and 0.92, respectively. The outcome of this study aims to provide a persistent, seamless stream of NDVI values, regardless of atmospheric conditions, illumination, or local conditions, which can support agricultural decision making.
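The hyperlocal idea described above — fitting a short-lived, field-specific model on recent paired NDVI–SAR observations and applying it to the current SAR acquisition — can be sketched as follows. This is a minimal illustrative stand-in, not the paper's actual model: the abstract does not specify the learner or the SAR features, so a simple least-squares regression on hypothetical (VV, VH) backscatter features is assumed here purely for demonstration.

```python
import numpy as np

def fit_field_model(sar_hist: np.ndarray, ndvi_hist: np.ndarray) -> np.ndarray:
    """Fit a per-field linear model mapping SAR features to NDVI.

    sar_hist  : (n, 2) recent SAR feature pairs (e.g. VV, VH backscatter) -- assumed features
    ndvi_hist : (n,)   matching cloud-free NDVI values from optical imagery
    Returns the regression coefficients (including an intercept term).
    """
    X = np.column_stack([sar_hist, np.ones(len(sar_hist))])  # add intercept column
    coef, *_ = np.linalg.lstsq(X, ndvi_hist, rcond=None)
    return coef

def predict_ndvi(coef: np.ndarray, sar_now: np.ndarray) -> float:
    """Estimate NDVI for the current SAR acquisition of the same field."""
    x = np.append(sar_now, 1.0)
    return float(np.clip(x @ coef, -1.0, 1.0))  # NDVI is bounded in [-1, 1]

# Toy synthetic example: one field's recent time series (values are made up)
rng = np.random.default_rng(0)
sar_hist = rng.normal(size=(12, 2))                          # 12 recent (VV, VH) pairs
ndvi_hist = 0.5 + 0.1 * sar_hist[:, 0] - 0.05 * sar_hist[:, 1]
coef = fit_field_model(sar_hist, ndvi_hist)
est = predict_ndvi(coef, np.array([0.2, -0.1]))              # NDVI estimate for a new SAR image
```

In the study, one such model would be trained per field per Sentinel-1 acquisition date and then discarded, which is what lets the method track the seasonally changing NDVI–SAR relationship instead of relying on a single global mapping.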