Abstract

Accurate and precise location of trees from data acquired under the canopy is challenging and time-consuming. However, current forestry practices would benefit tremendously from the knowledge of tree coordinates, particularly when the information used to position them is acquired with inexpensive sensors. Therefore, the objective of our study is to geo-reference trees using point clouds created from images acquired below canopy. We developed a procedure that uses the coordinates of the trees seen from above canopy to position the same trees seen below canopy. To geo-reference the trees from above canopy, we captured images with an unmanned aerial vehicle. We reconstructed the trunks with photogrammetric point clouds built with a structure-from-motion procedure from images recorded in a circular pattern at multiple locations throughout the stand. We matched the trees segmented from below canopy with the trees extracted from above canopy using a non-rigid point-matching algorithm. To ensure accuracy, we reduced the number of trees to be matched by partitioning the trees segmented from above canopy with a grid of 50 m cells. Our procedure was implemented on a 7.1 ha Douglas-fir stand in Oregon, USA. The proposed procedure is relatively fast, as approximately 600 trees were mapped in roughly 1 min. The procedure is sensitive to point density, which directly impacts tree location: differences larger than 2 m between the coordinates of the tree top and of the bottom part of the stem could lead to matching errors larger than 1 m. Furthermore, the larger the number of trees to be matched, the higher the accuracy, which can accommodate misalignment errors larger than 2 m between the locations of the trees segmented from above and from below the canopy.
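
The matching step described above lends itself to a brief illustration. The sketch below is not the authors' implementation: it bins geo-referenced above-canopy tree positions into 50 m grid cells and, within one candidate cell, substitutes a much simpler translation-plus-assignment surrogate for the non-rigid point-matching algorithm; the function names, the rejection tolerance, and the synthetic data are assumptions made for the example.

# Simplified sketch of the tree-matching step (not the authors' code).
# A translation-only alignment plus Hungarian assignment stands in for
# the non-rigid point-matching algorithm; the 50 m grid only limits the
# above-canopy candidates, as described in the abstract.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

CELL = 50.0  # grid cell size in meters


def grid_cells(points_xy, cell=CELL):
    """Group above-canopy tree coordinates (n x 2 array) into square cells."""
    keys = np.floor(points_xy / cell).astype(int)
    cells = {}
    for idx, key in enumerate(map(tuple, keys)):
        cells.setdefault(key, []).append(idx)
    return {k: np.asarray(v) for k, v in cells.items()}


def match_cell(below_xy, above_xy, max_dist=2.0):
    """Match below-canopy trees to above-canopy trees within one cell.

    A translation estimated from the centroids is applied first, then a
    one-to-one assignment minimises the summed distances; pairs further
    apart than max_dist meters are discarded.
    """
    shift = above_xy.mean(axis=0) - below_xy.mean(axis=0)
    cost = cdist(below_xy + shift, above_xy)
    rows, cols = linear_sum_assignment(cost)
    keep = cost[rows, cols] <= max_dist
    return list(zip(rows[keep], cols[keep])), shift


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    above = rng.uniform(0, 50, size=(40, 2))          # geo-referenced positions
    below = above + rng.normal(0, 0.3, size=(40, 2))  # same trees, noisy
    below -= below.mean(axis=0)                       # unknown local frame
    print(f"above-canopy trees fall into {len(grid_cells(above))} cell(s)")
    pairs, shift = match_cell(below, above)
    print(f"matched {len(pairs)} of {len(below)} trees, estimated shift {shift}")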

Highlights

  • Many forest management decisions are based on attributes measured under the canopy, such as diameter at breast height or height to the base of the live crown [1,2]

  • We aimed to develop a relatively inexpensive, fast, and accurate procedure for geo-referencing 3D point clouds representing the lower section of the trunk using only RGB images

  • Multiple photogrammetric point clouds are developed using structure-from-motion implemented in Agisoft [22]: one from images acquired with an unmanned aerial vehicle (i.e., a DJI Phantom 3 Professional), which is used for tree segmentation from above canopy, and several from images acquired at different locations throughout the stand, which serve to identify the absolute location of the trees as seen from below canopy (a minimal sketch of this workflow follows these highlights)
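
As referenced in the last highlight, the two kinds of point clouds can be produced with the Agisoft Python scripting interface. The sketch below is a minimal outline only, assuming the Metashape 1.x API with default processing settings; method names differ across versions (e.g., buildDenseCloud() became buildPointCloud() in 2.x), and the folder layout, file names, and station naming are placeholders rather than those used in the study.

# Minimal outline, assuming the Agisoft Metashape 1.x Python API with
# default settings; paths and station names are placeholders.
import glob
import Metashape


def build_point_cloud(image_dir, project_path):
    """Run a default structure-from-motion and dense reconstruction."""
    doc = Metashape.Document()
    chunk = doc.addChunk()
    chunk.addPhotos(sorted(glob.glob(f"{image_dir}/*.JPG")))
    chunk.matchPhotos()      # feature detection and matching
    chunk.alignCameras()     # sparse reconstruction (camera poses)
    chunk.buildDepthMaps()
    chunk.buildDenseCloud()  # buildPointCloud() in Metashape 2.x
    doc.save(project_path)


# One above-canopy cloud from the UAV flight, plus one cloud per
# below-canopy circular image sequence recorded in the stand.
build_point_cloud("uav_images", "above_canopy.psx")
for i, station in enumerate(sorted(glob.glob("below_canopy/station_*"))):
    build_point_cloud(station, f"below_canopy_{i:02d}.psx")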

Introduction

Many forest management decisions are based on attributes measured under the canopy, such as diameter at breast height (dbh) or height to the base of the live crown [1,2]. Technological developments in material sciences, sensors, information technology, and harvesting equipment allow fast and accurate estimation, while moving under the canopy, of the attributes relevant to forest activities, such as dbh, taper, and total height. Whereas procedures for precise, accurate, and fast estimates of dbh and total height are available for a reduced set of trees (i.e., plots or samples), difficulties are encountered when estimates for all trees in a stand are needed. Although the total height of each tree in a stand can be estimated relatively easily from point clouds, either lidar or photogrammetric, there are at least two attributes that are difficult to estimate accurately and precisely for all trees: dbh and location. The effort placed in the estimation of tree location mirrors the effort placed in the estimation of tree size, but the success has been clearly less impressive, as the accuracy is still measured in meters [8].
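
To illustrate why per-tree total height is the comparatively easy attribute, the sketch below shows one common approach, not necessarily the one used by the authors: rasterise the point cloud to a canopy height model (CHM) and take local maxima as tree tops. The window size, the height threshold, and the synthetic CHM are assumptions made for the example.

# Illustrative only: tree tops as local maxima of a canopy height model.
# The CHM array, window size, and height threshold are assumptions.
import numpy as np
from scipy.ndimage import maximum_filter


def tree_tops_from_chm(chm, window=5, min_height=2.0):
    """Return (row, col, height) of local maxima in a CHM raster."""
    local_max = maximum_filter(chm, size=window) == chm
    tops = local_max & (chm > min_height)
    rows, cols = np.nonzero(tops)
    return np.column_stack([rows, cols, chm[rows, cols]])


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    chm = rng.uniform(0.0, 1.0, size=(100, 100))  # stand-in for a real CHM (m)
    chm[20, 30] = 35.0                            # two synthetic "tree tops"
    chm[70, 55] = 28.0
    print(tree_tops_from_chm(chm))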
