Abstract

The development of near-surface remote sensing requires accurate extraction of leaf area index (LAI) from networked digital cameras under all illumination conditions. The widely used directional gap fraction model is better suited to overcast conditions because of the difficulty of discriminating the shaded foliage from the shadowed parts of images acquired on sunny days. In this study, a new method is proposed that extracts LAI from the sunlit foliage component of downward-looking digital photography under clear-sky conditions. In this method, the sunlit foliage component is extracted by an automated image classification algorithm named LAB2, the clumping index is estimated by a path length distribution-based method, the leaf angle distribution (LAD) and G function are quantified from leveled digital images and, eventually, LAI is obtained by introducing a geometric-optical (GO) model that quantifies the sunlit foliage proportion. The proposed method was evaluated at the YJP site, Canada, using a 3D realistic structural scene constructed from field measurements. Results suggest that the LAB2 algorithm enables automated image processing and accurate sunlit foliage extraction, with a minimum overall accuracy of 91.4%. The widely used finite-length method tends to underestimate the clumping index, whereas the path length distribution-based method reduces the relative error (RE) from 7.8% to 6.6%. Applying the directional gap fraction model under sunny conditions leads to an underestimation of LAI by 1.61 (55.9%), which is well outside the accuracy requirement (0.5; 20%) of the Global Climate Observing System (GCOS). The proposed LAI extraction method has an RMSE of 0.35 and an RE of 11.4% under sunny conditions, which meets the GCOS accuracy requirement. This method relaxes the diffuse-illumination conditions required for digital photography and can be applied to extract LAI from downward-looking webcam images, which is promising for regional- to continental-scale monitoring of vegetation dynamics and for the validation of satellite remote sensing products.
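For context, the directional gap fraction model referred to above is commonly written as P(θ) = exp(−G(θ)·Ω·LAI / cos θ), and the finite-length clumping index follows Lang and Xiang (1986). The sketch below illustrates these two standard components only; it is not the authors' LAB2 classification, path length distribution method, or GO-model-based sunlit-foliage retrieval, and the function names and example numbers are illustrative assumptions.

```python
import numpy as np

def lai_from_gap_fraction(gap_fraction, theta_deg, G=0.5, omega=1.0):
    """Invert the directional gap fraction model (Beer-Lambert form):
        P(theta) = exp(-G(theta) * Omega * LAI / cos(theta))
    so that LAI = -cos(theta) * ln(P) / (G * Omega).

    gap_fraction : measured canopy gap fraction P(theta), 0 < P <= 1
    theta_deg    : view zenith angle in degrees
    G            : leaf projection function G(theta) (0.5 for a spherical LAD)
    omega        : clumping index (1.0 for a randomly distributed canopy)
    """
    theta = np.radians(theta_deg)
    return -np.cos(theta) * np.log(gap_fraction) / (G * omega)

def clumping_index_finite_length(segment_gap_fractions):
    """Finite-length (Lang and Xiang, 1986) clumping index:
        Omega = ln(mean(P)) / mean(ln(P)),
    computed over the gap fractions of short transect segments."""
    p = np.asarray(segment_gap_fractions, dtype=float)
    return np.log(p.mean()) / np.log(p).mean()

# Example: a gap fraction of 0.25 at a 30-degree view zenith angle,
# with a mildly clumped canopy (Omega < 1) and a spherical leaf angle distribution.
omega = clumping_index_finite_length([0.15, 0.40, 0.20, 0.30])
print(lai_from_gap_fraction(0.25, 30.0, G=0.5, omega=omega))
```

Under clear skies, misclassifying shaded foliage as gap inflates P(θ) and, through the logarithm above, systematically lowers the retrieved LAI, which is the underestimation the abstract quantifies.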

Highlights

  • Leaf area index (LAI) is a key biophysical variable to characterize vegetation canopy structure and functioning in most ecosystem productivity and land surface process models [1,2]

  • The sunlit foliage component (PT) extracted by the automated LAB2 image classification algorithm from 22 images in the principal plane and the cross plane under sunny conditions was compared with that determined by the 3D software, as shown in Figures 4 and 5

  • The poor performance of the gap fraction model on sunny days has limited the extraction of leaf area index (LAI) by near-surface remote sensing under varying illumination conditions

Introduction

Leaf area index (LAI) is a key biophysical variable for characterizing vegetation canopy structure and functioning in most ecosystem productivity and land surface process models [1,2]. The accuracy of remotely sensed LAI can be affected by land surface heterogeneity, the impact of clouds and aerosols on atmospheric correction, uncertainties in the forward model used to create the look-up tables, and the saturation of optical signals over dense canopies when the lower layers are obscured by the upper layers [5,6,7,8]. LAI can be measured through direct and indirect methods in field campaigns [9]. Direct LAI measurements include destructive sampling and non-harvest litter traps for deciduous forests; these are the most accurate but are extremely labor-intensive and time-consuming [9,10,11]. Indirect methods using optical radiometric or imaging sensors, e.g., the LAI-2000 Plant Canopy Analyzer (PCA), Digital
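The indirect optical methods mentioned above typically invert gap fractions measured at several zenith angles. As a point of reference, and not taken from this paper, the sketch below applies Miller's (1967) integral, which underlies LAI-2000-style retrievals of effective LAI; the ring angles and gap fraction values are illustrative assumptions.

```python
import numpy as np

def effective_lai_miller(gap_fractions, theta_deg):
    """Effective LAI via Miller's (1967) integral:
        Le = 2 * integral over [0, pi/2] of -ln(P(theta)) * cos(theta) * sin(theta) dtheta,
    approximated here by the trapezoidal rule over the measured zenith angles
    (angles must be given in ascending order)."""
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    p = np.asarray(gap_fractions, dtype=float)
    contact = -np.log(p)                               # contact number -ln(P) per angle
    integrand = contact * np.cos(theta) * np.sin(theta)
    dtheta = np.diff(theta)
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * dtheta)
    return 2.0 * integral

# Example with hypothetical gap fractions at the five nominal LAI-2000 ring angles.
theta_rings = [7.0, 23.0, 38.0, 53.0, 68.0]
p_rings = [0.45, 0.38, 0.30, 0.22, 0.15]
print(effective_lai_miller(p_rings, theta_rings))
```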
