Abstract

Remote sensing data are one of the primary data sources for precision agriculture. Several studies have demonstrated the strong capability of radar and optical imagery for crop mapping and biophysical parameter estimation. This paper models crop biophysical parameters, namely Leaf Area Index (LAI) and biomass, using a combination of radar and optical Earth observations. We extracted several radar features from polarimetric Synthetic Aperture Radar (SAR) data and Vegetation Indices (VIs) from optical images to model crop LAI and dry biomass. We then calculated the mutual correlations between these features and their Random Forest feature importance. We considered two scenarios for estimating crop parameters. First, Machine Learning (ML) algorithms, i.e., Support Vector Regression (SVR), Random Forest (RF), Gradient Boosting (GB), and Extreme Gradient Boosting (XGB), were used to estimate the two crop biophysical parameters. To this end, crop dry biomass and LAI were estimated using three input datasets: (1) SAR polarimetric features; (2) spectral VIs; and (3) a combination of both SAR and optical features. Second, a deep artificial neural network was created. These input datasets were fed to the above algorithms and evaluated against in-situ measurements. These observations of three cash crops, soybean, corn, and canola, were collected over Manitoba, Canada, during the Soil Moisture Active Passive Validation Experiment 2012 (SMAPVEX12) campaign. The results showed that GB and XGB have great potential for parameter estimation and remarkably improved accuracy. Our results also demonstrated a significant improvement in dry biomass and LAI estimation compared to previous studies. For LAI, the validation Root Mean Square Error (RMSE) was 0.557 m2/m2 for canola using GB, 0.298 m2/m2 for corn using GB, and 0.233 m2/m2 for soybean using XGB.
For dry biomass, RMSE was 26.29 g/m2 for canola using SVR, 57.97 g/m2 for corn using RF, and 5.00 g/m2 for soybean using GB. The results revealed that the deep artificial neural network had better potential for estimating crop parameters than the ML algorithms.
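The first scenario described above (fitting ML regressors separately on SAR features, optical VIs, and their combination, then comparing validation RMSE) can be sketched with scikit-learn. This is an illustrative sketch, not the authors' code: the arrays below are synthetic placeholders standing in for the SMAPVEX12 polarimetric features, VIs, and LAI measurements.

```python
# Hedged sketch of the paper's first scenario: compare validation RMSE of a
# regressor trained on SAR features, VIs, and the two sets combined.
# All data here are synthetic stand-ins, not SMAPVEX12 measurements.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n = 300
sar = rng.normal(size=(n, 5))   # placeholder polarimetric SAR features
vis = rng.normal(size=(n, 4))   # placeholder optical vegetation indices
# Synthetic LAI: depends on one SAR feature and (weakly) one VI, plus noise.
lai = 2.0 + sar[:, 0] + 0.5 * vis[:, 0] + rng.normal(scale=0.2, size=n)

def validation_rmse(features, target, model):
    """Hold out 25% of samples, fit the model, return validation RMSE."""
    X_tr, X_te, y_tr, y_te = train_test_split(features, target, random_state=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te)) ** 0.5

results = {}
for name, X in [("SAR", sar), ("VI", vis), ("SAR+VI", np.hstack([sar, vis]))]:
    results[name] = validation_rmse(X, lai, GradientBoostingRegressor(random_state=0))

print(results)  # with this synthetic target, SAR+VI should beat VI alone
```

Swapping in `RandomForestRegressor`, `SVR`, or XGBoost's `XGBRegressor` reproduces the paper's other three models under the same comparison loop.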

Highlights

  • Due to rapid population growth and climate change, global food security and agricultural production risks have increased [1]

  • The impact of optical Vegetation Indices (VIs), UAVSAR polarimetric features, and their integration on the accuracy of retrieving dry biomass and Leaf Area Index (LAI) is assessed using four machine learning regression models

  • For corn dry biomass and LAI, the deep artificial neural network (ANN) provided a Root Mean Square Error (RMSE) of 54.43 g/m2 and 0.273 m2/m2, respectively (Figure 6c,d)
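The feature screening mentioned in the abstract, ranking SAR and optical inputs by Random Forest feature importance, amounts to reading `feature_importances_` from a fitted forest in scikit-learn. A minimal sketch on synthetic placeholder data (not the SMAPVEX12 features):

```python
# Hedged sketch: rank candidate input features by Random Forest importance.
# Synthetic data only; feature 0 is constructed to dominate the target.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # e.g. 3 candidate SAR/VI features
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(rf.feature_importances_)  # importances sum to 1; feature 0 ranks first
```

In the study's setting, the same ranking would indicate which polarimetric features and VIs contribute most to the LAI and biomass models.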


Summary

Introduction

Due to rapid population growth and climate change, global food security and agricultural production risks have increased [1]. Information about annual crop production is vital for global and local food security. As input data for crop models, crop biophysical parameters are estimated using direct and indirect methods. Direct methods involve ground measurement of plant parameters and are usually destructive, costly, time-consuming, and complicated [5]. Remote sensing provides vital information on crop growth conditions over agricultural areas owing to its extensive coverage and spatio-temporal resolution [6]. To this end, remote sensing imagery is well suited to accurate crop monitoring.

