Abstract

Recently, Armston et al. (2013) demonstrated that a new, physically-based method for direct retrieval of canopy gap probability Pgap from waveform lidar can improve the estimation of Pgap over discrete return lidar data. The success of the approach was demonstrated in a savanna woodland environment in Australia. The key advantage of this method is that it uses the data themselves to solve for the canopy contrast term, i.e. the ratio of the reflectance from crown and ground, ρv/ρg. In this way the method avoids the local calibration that is typically required to overcome differences in either ρv or ρg. To be more generally useful the method must be demonstrated at different sites and in the presence of slope and of different sensor and survey configurations. If it is robust to these factors, slope in particular, then it is likely to be widely useful. Here, we test the robustness of the retrieval of Pgap from waveform lidar using the Watershed Allied Telemetry Experimental Research dataset, over the Heihe River Basin region of China. The data contain significant canopy, terrain and survey variations, presenting a rather different set of conditions to those previously used. Results show that ρv/ρg is stable across all flights and for all levels of spatial aggregation. This strongly supports the robustness of the new Pgap retrieval method, which assumes that this relationship is stable. A comparison between Pgap estimated from hemiphotos and from the waveform lidar showed agreement with a Pearson correlation coefficient R = 0.91. The waveform lidar-derived estimates of Pgap agreed to within 8% of values derived from hemiphotos, with a bias of 0.17%. The new waveform model was shown to be stable across different off-nadir scan angles and in the presence of slopes up to 26°, with R ≥ 0.85 in all cases.
We also show that the waveform model can be used to calculate Pgap using just the mean value of canopy returns, assuming that their distribution is unimodal. Lastly, we show that the method can also be applied to discrete return lidar data, albeit with slightly lower accuracy and higher bias, allowing Pgap comparisons with previously-collected lidar datasets. Our results show the new method should be applicable for estimating Pgap robustly across large areas, and from lidar data collected at different times and using different systems, an increasingly important requirement.

Highlights

  • Directional gap probability, Pgap(θ), is defined as the probability that a light beam of infinitesimal width, at zenith angle θ to the local normal, is directly transmitted through a vegetation canopy (Armston et al., 2013)

  • The aim of this study was to investigate whether a newly-proposed method of estimating canopy gap fraction Pgap from waveform lidar is robust across varying terrain, canopy and sensor configurations

  • This method assumes that the ratio of lidar returns from canopy and ground ρv/ρg is stable, but unlike other methods does not require a priori knowledge of either ρg or ρv separately, as, crucially, it solves for the ratio ρv/ρg using the data alone rather than requiring local calibration
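The ratio-based retrieval described in the highlights above can be sketched as follows. This is an illustrative reduction only, not the published waveform model: the function name and the simple energy-balance normalisation are assumptions. It follows from taking canopy return energy Rv ∝ ρv(1 − Pgap) and ground return energy Rg ∝ ρg·Pgap, and solving for Pgap given the ratio ρv/ρg.

```python
def pgap_from_returns(rv, rg, rho_ratio):
    """Directional gap probability from integrated canopy (rv) and
    ground (rg) lidar return energies, given the canopy-to-ground
    reflectance ratio rho_ratio = rho_v / rho_g.

    Assuming Rv is proportional to rho_v * (1 - Pgap) and
    Rg is proportional to rho_g * Pgap, with a common constant,
    solving the pair of proportionalities gives:

        Pgap = 1 - Rv / (Rv + (rho_v / rho_g) * Rg)
    """
    if rv < 0 or rg < 0 or rho_ratio <= 0:
        raise ValueError("energies must be non-negative, ratio positive")
    if rv == 0 and rg == 0:
        raise ValueError("no return energy recorded")
    return 1.0 - rv / (rv + rho_ratio * rg)
```

The point emphasised in the highlight is that rho_ratio need not be known a priori: the method of Armston et al. (2013) solves for it from the data, whereas this sketch simply shows how Pgap follows once the ratio is in hand.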


Introduction

Directional gap probability, Pgap(θ), is defined as the probability that a light beam of infinitesimal width, at zenith angle θ to the local normal, is directly transmitted through a vegetation canopy (Armston et al., 2013). The importance of Pgap(θ) lies in its relationship to radiation interception within the canopy and to other canopy structure parameters, such as LAI and above-ground biomass (Campbell & Norman, 1989; Ni-Meister et al., 2010). These latter properties may be modelled using different expressions, combinations or spatial variance of canopy height and Pgap(θ), since Pgap(θ) represents the integrated effect of several scale-dependent canopy structural properties.
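The link between Pgap(θ) and LAI mentioned above is commonly expressed through a Beer-Lambert-type gap model, Pgap(θ) = exp(−G(θ)·L/cos θ), where G(θ) is the foliage projection coefficient (a textbook relation, e.g. Campbell & Norman, 1989, not a formula taken from this paper). A minimal sketch of inverting it for an effective plant area index, with the function name and default G = 0.5 (spherical leaf-angle distribution) as assumptions:

```python
import math

def effective_pai(pgap, theta_deg, g=0.5):
    """Invert the Beer-Lambert-type gap model
        Pgap(theta) = exp(-g * PAI / cos(theta))
    for an effective plant area index.

    g is the foliage projection coefficient G(theta); g = 0.5
    corresponds to a spherical leaf-angle distribution.
    """
    if not 0.0 < pgap <= 1.0:
        raise ValueError("pgap must lie in (0, 1]")
    theta = math.radians(theta_deg)
    return -math.cos(theta) * math.log(pgap) / g
```

Because clumping and scale-dependent structure are folded into the single exponent, this inversion yields an effective, not true, plant area index, which is why the text notes that Pgap(θ) integrates several scale-dependent structural properties.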

