Abstract
As a common light detection and ranging (LiDAR) data product in forestry applications, the canopy height model (CHM) describes the elevation distribution of aboveground vegetation. A CHM is traditionally generated by interpolating all the first LiDAR echoes. However, the first echo cannot accurately represent the canopy surface, and the resulting large amount of noise (data pits) also reduces CHM quality. Although previous studies have proposed many pit-filling methods, their applicability to high-resolution CHMs derived from unmanned aerial vehicle laser scanning (UAVLS) has not been systematically evaluated. This study selected eight widely used, recently developed, representative pit-filling methods, namely first-echo interpolation, smoothing filters (mean, median and Gaussian), highest point interpolation, the pit-free algorithm, the spike-free algorithm and graph-based progressive morphological filtering (GPMF). A comprehensive evaluation framework was implemented, including a quantitative evaluation on simulated data and an additional application evaluation on UAVLS data. The results indicated that the spike-free algorithm and GPMF had excellent visual performance and were closest to the real canopy surface (root mean square errors (RMSEs) of 0.1578 m and 0.1093 m, respectively, on the simulated data, and 0.3179 m and 0.4379 m, respectively, on the UAVLS data). Compared with the first-echo method, the accuracies of the spike-free algorithm and GPMF improved by approximately 23% and 22%, respectively. The pit-free algorithm and the highest point interpolation method also have advantages for high-resolution CHM generation. Global smoothing filters applied to the first-echo CHM reduced the average canopy height by approximately 7.73%. Coniferous forests require more pit-filling than broad-leaved and mixed forests. Although the individual-tree applications showed no significant differences among these methods except the median filter, pit-filling remains important for generating high-resolution CHMs. This study provides guidance for using high-resolution UAVLS in forestry applications.
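The evaluation described above compares each pit-filled CHM against a reference canopy surface using RMSE, with highest point interpolation serving as one of the simplest gridding baselines. The sketch below is a minimal illustration of those two pieces in Python/NumPy, not the authors' implementation; the function names and the array names in the usage comment are hypothetical.

```python
import numpy as np

def grid_highest_point(x, y, z, cell=0.1):
    """Rasterise normalised LiDAR points into a CHM by keeping the highest
    point in each cell (a simple 'highest point interpolation' baseline).

    x, y, z : 1-D arrays of point coordinates and normalised heights (m)
    cell    : CHM resolution in metres
    """
    cols = ((x - x.min()) / cell).astype(int)
    rows = ((y - y.min()) / cell).astype(int)
    chm = np.full((rows.max() + 1, cols.max() + 1), -np.inf)
    np.maximum.at(chm, (rows, cols), z)   # keep the highest return per cell
    chm[np.isneginf(chm)] = np.nan        # cells that received no points
    return chm

def rmse(chm, reference):
    """Root mean square error against a reference canopy surface on the same grid."""
    return float(np.sqrt(np.nanmean((chm - reference) ** 2)))

# Hypothetical usage: 'pts' holds normalised point coordinates, 'reference_surface'
# is a simulated canopy surface on the same grid.
# chm = grid_highest_point(pts[:, 0], pts[:, 1], pts[:, 2], cell=0.1)
# print(rmse(chm, reference_surface))
```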
Highlights
The results showed that the median and Gaussian filters achieved their highest accuracies with a 5 × 5 window, while the mean filter was most accurate with a 3 × 3 window.
For a hemisphere with 10% pits, a smaller window is more suitable for Gaussian filtering, whereas larger pits are filled more accurately with a larger Gaussian window (see the sketch below).
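The window-size comparison in these highlights can be outlined with SciPy's standard image filters. This is only an illustrative sketch: the CHM is assumed to be a 2-D NumPy array without no-data gaps, and the mapping from a window size to a Gaussian sigma/truncate pair is an assumption made here, not a setting from the paper.

```python
import numpy as np
from scipy import ndimage

def smooth_chm(chm, method="median", size=3):
    """Apply a global smoothing filter to a CHM raster.

    method : 'mean', 'median' or 'gaussian'
    size   : odd window size in pixels (3 -> 3x3, 5 -> 5x5)
    """
    if method == "mean":
        return ndimage.uniform_filter(chm, size=size)
    if method == "median":
        return ndimage.median_filter(chm, size=size)
    if method == "gaussian":
        # Approximate an N x N window by truncating the kernel at the window
        # radius; the sigma choice below is illustrative, not from the paper.
        radius = (size - 1) / 2.0
        sigma = radius / 2.0
        return ndimage.gaussian_filter(chm, sigma=sigma, truncate=radius / sigma)
    raise ValueError(f"unknown method: {method}")

# Hypothetical comparison over the window sizes mentioned in the highlights.
# for method, size in [("mean", 3), ("median", 5), ("gaussian", 5)]:
#     smoothed = smooth_chm(chm_first_echo, method, size)
```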
Summary
As an active remote sensing technology, light detection and ranging (LiDAR) can penetrate the canopy to obtain the vertical structure of a forest, and it has been widely used in forest inventory analyses [1,2]. LiDAR data come in two common formats: the original discrete point cloud, and spatially continuous raster surfaces derived from the raw laser points, such as the digital elevation model (DEM) and the digital surface model (DSM) [3]. The canopy height model (CHM) is usually constructed by subtracting the DEM from the DSM or by interpolating the normalized point cloud, and it directly represents the absolute height distribution of the vegetation canopy above the ground [4]. Compared with the original point cloud data, the storage volume of the raster CHM is much smaller.
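Once the DSM and DEM are co-registered on the same grid, the CHM construction described above reduces to a per-cell subtraction. The following is a minimal sketch under that assumption; the function name and no-data handling are illustrative rather than taken from the paper.

```python
import numpy as np

def chm_from_surfaces(dsm, dem, nodata=np.nan):
    """Derive a canopy height model as CHM = DSM - DEM.

    dsm, dem : 2-D arrays on the same grid (same shape, resolution, extent)
    Cells where either surface is missing are set to `nodata`; small negative
    heights caused by interpolation noise are clipped to zero.
    """
    if dsm.shape != dem.shape:
        raise ValueError("DSM and DEM must be on the same grid")
    heights = np.clip(dsm - dem, 0.0, None)
    return np.where(np.isnan(dsm) | np.isnan(dem), nodata, heights)
```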