Abstract

Leaf area index (LAI) and biomass are important indicators that reflect the growth status of maize. Optical vegetation indices and the synthetic-aperture radar (SAR) backscattering coefficient are commonly used to estimate LAI and biomass. However, previous studies have suggested that spectral features extracted from a single pixel describe the canopy structure poorly. In this paper, we propose a method for estimating LAI and biomass by combining spectral and texture features. Specifically, LAI, biomass, and remote-sensing data were collected at the jointing, trumpet, flowering, and filling stages of maize. We then formed six remote-sensing feature matrices from the spectral and texture features extracted from the remote-sensing data. Principal component analysis (PCA) was used to remove noise and to reduce and integrate the multi-dimensional features. Multiple linear regression (MLR) and support vector regression (SVR) methods were used to build the estimation models, and tenfold cross-validation was adopted to verify the effectiveness of the proposed method. The experimental results show that using the texture features of both optical and SAR data improves the estimation accuracy of LAI and biomass; in particular, SAR texture features greatly improve the estimation accuracy of biomass. The estimation model constructed by combining spectral and texture features of optical and SAR data achieves the best performance (highest coefficient of determination ($R^{2}$) and lowest root mean square error (RMSE)). We find that the best window sizes for extracting texture features from optical and SAR data are $3\times 3$ and $7\times 7$, respectively, and that SVR is more suitable than MLR for estimating the LAI and biomass of maize. In addition, adding texture features significantly improves the estimation accuracy of LAI and biomass for growth stages with larger variation in canopy extent. Overall, this work shows the potential of combining spectral and texture features for improving the estimation accuracy of LAI and biomass in maize.
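
As a rough illustration of the modeling workflow summarized above, the sketch below (a minimal example, not code from the paper) chains standardization, PCA, and SVR in scikit-learn and evaluates the model with tenfold cross-validation. The synthetic feature matrix, the SVR hyperparameters, and the choice to retain components explaining 95% of the variance are assumptions made for illustration only.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.model_selection import KFold, cross_val_score

# X: samples x (spectral + texture) features; y: field-measured LAI or biomass.
# Random placeholders stand in for the real feature matrices and measurements.
rng = np.random.default_rng(0)
X = rng.random((120, 40))
y = rng.random(120)

model = make_pipeline(
    StandardScaler(),                           # put all features on a common scale
    PCA(n_components=0.95, svd_solver="full"),  # keep components explaining 95% of variance (assumed threshold)
    SVR(kernel="rbf", C=10.0, epsilon=0.1),     # assumed hyperparameters
)

# Tenfold cross-validation, reporting R^2 and RMSE as in the paper's evaluation.
cv = KFold(n_splits=10, shuffle=True, random_state=0)
r2 = cross_val_score(model, X, y, cv=cv, scoring="r2")
rmse = -cross_val_score(model, X, y, cv=cv, scoring="neg_root_mean_squared_error")
print(f"R2 = {r2.mean():.3f}, RMSE = {rmse.mean():.3f}")
```

An MLR baseline can be obtained by replacing the `SVR` step with `sklearn.linear_model.LinearRegression()` in the same pipeline.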

Highlights

  • The leaf area index (LAI) and biomass are important indicators for monitoring the growth of maize [1]–[3]

  • In this study, a method for estimating maize LAI and biomass by combining spectral and texture features was proposed, and the effects of different satellite data, texture-extraction window sizes, modeling methods, and growth stages on the estimation were explored (a texture-extraction sketch follows these highlights)

  • The spectral features include reflectance and vegetation indices extracted from optical data, and the backscattering coefficient and polarization metrics extracted from synthetic-aperture radar (SAR) data
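
The excerpt does not name the texture measures used; grey-level co-occurrence matrix (GLCM) statistics are a common choice in crop studies and are used here purely as an illustrative assumption. The sketch below extracts window-based GLCM texture from a single band with scikit-image; the grey-level quantization, pixel-pair distance, angles, and selected properties are all assumed, and the window size can be set to $3\times 3$ for optical or $7\times 7$ for SAR imagery.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_texture(band, window=3, levels=32,
                 props=("contrast", "homogeneity", "energy", "correlation")):
    """Sliding-window GLCM texture for one image band (slow, illustrative only)."""
    # Quantize the band to a small number of grey levels for the co-occurrence matrix.
    edges = np.linspace(band.min(), band.max(), levels)
    q = (np.digitize(band, edges) - 1).astype(np.uint8)

    pad = window // 2
    padded = np.pad(q, pad, mode="edge")
    out = np.zeros((len(props),) + band.shape, dtype=float)

    for i in range(band.shape[0]):
        for j in range(band.shape[1]):
            patch = padded[i:i + window, j:j + window]
            glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                                levels=levels, symmetric=True, normed=True)
            for k, prop in enumerate(props):
                # Average each property over the two assumed angles.
                out[k, i, j] = graycoprops(glcm, prop).mean()
    return out
```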


Summary

Introduction

The leaf area index (LAI) and biomass are important indicators for monitoring the growth of maize [1]–[3]. They provide important information for monitoring temperature stress, water stress, pest levels, early yield, and related conditions. Remote-sensing inversion reduces the time and labor needed to obtain LAI and biomass [14], [15]. Spectral reflectance and vegetation indices extracted from optical data, together with the backscattering coefficient and polarization metrics extracted from synthetic-aperture radar (SAR) data, are widely used for estimating LAI and biomass [23]–[27]. The normalized difference vegetation index (NDVI) is the most commonly used of these vegetation indices.
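
For reference, NDVI is computed from the near-infrared ($\rho_{\mathrm{NIR}}$) and red ($\rho_{\mathrm{Red}}$) reflectances as
$$\mathrm{NDVI}=\frac{\rho_{\mathrm{NIR}}-\rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}}+\rho_{\mathrm{Red}}},$$
with values approaching 1 over dense green vegetation.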

