Abstract

Increasingly advanced and affordable close-range sensing techniques are being employed by an ever-broadening range of users with varying competence and experience. In this context, a method was tested that uses photogrammetry and machine-learning classification to divide a point cloud into different surface-type classes. The study site is a 20-metre-long peat scarp in the actively eroding river bank of the Rotmoos valley near Obergurgl, Austria. Imagery from near-infrared (NIR) and conventional (RGB) sensors, georeferenced with coordinates of targets surveyed with a total station, was used to create a point cloud using structure from motion and dense image matching. NIR and RGB information were merged into a single point cloud, and 18 geometric features were extracted at each of three radii (0.02 m, 0.05 m and 0.1 m), which, together with the spectral values, gave 58 variables on which to apply the machine-learning classification. Segments representing six classes (dry grass, green grass, peat, rock, snow and target) were extracted from the point cloud and split into a training set and a testing set. A Random Forest model was trained using machine-learning packages from CRAN in the R environment. The overall classification accuracy and Kappa index were 98% and 97% respectively. The rock, snow and target classes had the highest producer and user accuracies. Dry and green grass had the highest omission errors (1.9% and 5.6% respectively) and commission errors (3.3% and 3.4% respectively). Analysis of feature importance revealed that the spectral descriptors (NIR, R, G, B) were by far the most important determinants, followed by verticality at the 0.1 m radius.
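The workflow summarised above (training a Random Forest on labelled segments, then scoring it with overall accuracy and the Kappa index) can be sketched as follows. The study used packages in the R environment, so this is an illustrative Python/scikit-learn analogue on synthetic stand-in data, not the authors' code; the class names and the 58-variable feature count are taken from the abstract, everything else is assumed for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
classes = ["dry_grass", "green_grass", "peat", "rock", "snow", "target"]

# Synthetic stand-in for the 58 per-point variables
# (spectral values plus 18 geometric features at each of 3 radii).
n_per_class = 200
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, 58))
               for i in range(len(classes))])
y = np.repeat(classes, n_per_class)

# Split the labelled segments into training and testing sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)
pred = rf.predict(X_test)

print(f"overall accuracy: {accuracy_score(y_test, pred):.2f}")
print(f"kappa index:      {cohen_kappa_score(y_test, pred):.2f}")
```

On real point-cloud features the scores would of course depend on class separability; `rf.feature_importances_` provides the kind of per-variable importance ranking the abstract reports.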

Highlights

  • In the past decades, a step change in close-range remote sensing technologies has allowed techniques such as photogrammetry to be employed by an increasingly diverse range of users, not just the specialist (Eltner et al., 2016; Westoby et al., 2012)

  • Imagery from near-infrared (NIR) and conventional (RGB) sensors, georeferenced with coordinates of targets surveyed with a total station, was used to create a point cloud using structure from motion and dense image matching

  • NIR and RGB information were merged into a single point cloud, and 18 geometric features were extracted at each of three radii (0.02 m, 0.05 m and 0.1 m), giving, together with the spectral values, 58 variables on which to apply the machine-learning classification
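One of the geometric features later found important, verticality, is commonly derived from the local surface normal, estimated as the least-variance principal axis of the neighbourhood within the chosen radius. The following minimal NumPy sketch illustrates that idea for a single point; it is a generic illustration of the technique, not the authors' implementation, and the function name and radius are assumptions.

```python
import numpy as np

def verticality(points, centre, radius):
    """Verticality (1 - |n_z|) of the neighbourhood of `centre`
    within `radius`, taking the least-variance PCA axis as the
    surface normal: ~0 for a horizontal patch, ~1 for a vertical one."""
    nbrs = points[np.linalg.norm(points - centre, axis=1) <= radius]
    cov = np.cov(nbrs.T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
    normal = eigvecs[:, 0]                   # axis of least variance
    return 1.0 - abs(normal[2])

# Horizontal patch (z = 0 everywhere): normal points up, verticality ~ 0.
rng = np.random.default_rng(0)
flat = np.column_stack([rng.uniform(-1, 1, 500),
                        rng.uniform(-1, 1, 500),
                        np.zeros(500)])
print(round(verticality(flat, np.zeros(3), 0.5), 2))  # → 0.0
```

Evaluating such a feature at several radii, as in the study, captures surface orientation at different spatial scales.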


Introduction

In the past decades, a step change in close-range remote sensing technologies has allowed techniques such as photogrammetry to be employed by an increasingly diverse range of users, not just the specialist (Eltner et al., 2016; Westoby et al., 2012). The inevitable result of this proliferation has been an abundance of high-quality data for which automated classification has become a practical necessity (Grilli et al., 2017), since manual labelling and classification are cost- and time-demanding and unfeasible for large datasets. In this context, at the 2019 Innsbruck Summer School in Obergurgl (Rutzinger et al., 2018, 2016), a team of researchers applied machine learning (ML) to a point cloud derived from dense image matching of a terrestrial photogrammetric survey. The fields of interest of the participants comprise a diversity of applications that can benefit from close-range sensing: from primary colonisation of recently deglaciated ground, through slope stability and evolution, to the surveying and interpretation of rarely preserved 700-million-year-old landforms. These users represent some of the numerous examples that may benefit from common data manipulation techniques, allowing statistical data to be derived from classifications within point cloud data.

