Abstract

Excessive cloud cover limits the feasibility of using optical imagery to monitor crop growth in many regions. In this study, we built an upsampling moving window network for regional crop growth monitoring (UMRCGM) to estimate two key biophysical parameters (BPs), leaf area index (LAI) and canopy chlorophyll content (CCC), during the main growth period of winter wheat using Sentinel-1 synthetic aperture radar (SAR) and Sentinel-3 optical images. Sentinel-1 imagery is unaffected by cloudy weather, and Sentinel-3 imagery has a wide swath and a short revisit period; combining the two can greatly improve the ability to monitor crop growth at a regional scale. The impact of two different types of SAR information (intensity and polarization) on the estimation of the two BPs was further analyzed. The UMRCGM model optimized the correspondence between inputs and outputs and produced more accurate LAI and CCC estimates than three classical machine learning models. Its accuracy was highest at the green-up stage of winter wheat, followed by the jointing and heading-filling stages, and lowest at the milk maturity stage. The estimation accuracy of CCC was slightly higher than that of LAI for the first three growth stages of winter wheat but lower than that of LAI at the milk maturity stage. This study proposes a new method for regional BP estimation (especially CCC) that combines SAR and optical imagery with large differences in spatial resolution under a deep learning framework.
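As a rough illustration only (the actual UMRCGM architecture is described in the full text), the sketch below shows one plausible way to pair a moving window of fine-resolution SAR pixels with a single coarse-resolution optical-scale target. The band count, window size, and layer sizes are placeholder assumptions, not the published configuration.

```python
# Hypothetical sketch (not the authors' published UMRCGM architecture): a small CNN
# that maps a moving window of Sentinel-1 SAR bands (e.g. VV/VH intensity plus
# polarization-derived features) onto a single coarse-resolution target value
# (LAI or CCC at Sentinel-3 pixel scale), illustrating how fine-resolution inputs
# can be matched to coarse-resolution outputs.
import torch
import torch.nn as nn

class WindowToPixelNet(nn.Module):
    def __init__(self, in_bands: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # aggregate the SAR window into one feature vector
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, 1),          # one biophysical parameter (LAI or CCC)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Example: a batch of 8 SAR windows, 4 bands, 15x15 pixels each,
# each paired with one Sentinel-3-scale LAI/CCC value during training.
model = WindowToPixelNet()
sar_windows = torch.randn(8, 4, 15, 15)
pred = model(sar_windows)   # shape: (8, 1)
```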
