Abstract

Point clouds are widely used in ground-based forest scanning with LiDAR and stereo cameras, but they often suffer from noise, outliers, and artifacts that distort the data. Hardware accuracy and the quality of the initial point cloud during ground scanning of a forest area can be improved by using scanners with higher resolution, as well as by applying photogrammetry or additional sensors. Noise can also be eliminated by software methods: point filtering, smoothing, statistical methods, and reconstruction algorithms. A new approach to filtering noise in a scanned forest area is based on analyzing the values of the color components in the YCbCr and L*a*b spaces. The properties of the YCbCr and L*a*b color models were investigated, and threshold values were determined for classifying points as noise or object depending on their distance to the centroids. Applying a combined (YCbCr | L*a*b) filter to the point cloud reduced the number of points to 38 963 (17.41% of the original count). When calibrating the camera and LiDAR based on the (YCbCr | L*a*b) filter, the total average translation error was 0.0247 m, the rotation error 6.244 degrees, and the reprojection error 8.385 pixels. The (YCbCr | L*a*b) noise-filtering method shows high accuracy and reliability in removing noise while maintaining the integrity of objects in the point cloud, which will allow data obtained by unmanned machines to be used later in logging operations.
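The abstract describes classifying points as noise or object by their distance to color-space centroids. A minimal sketch of that idea is shown below for the YCbCr space only (the paper combines YCbCr and L*a*b); the conversion uses the standard ITU-R BT.601 formulas, and the `threshold` parameter is an assumed illustration, not a value from the paper.

```python
import math

def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit RGB to full-range YCbCr (ITU-R BT.601)."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def filter_points(points, threshold):
    """Keep points whose (Cb, Cr) chroma lies within `threshold`
    of the cloud's chroma centroid; the rest are treated as noise.

    `points` is a list of (x, y, z, r, g, b) tuples. This is a
    hypothetical single-space sketch; the paper's combined
    (YCbCr | L*a*b) filter applies the same test in both spaces.
    """
    chroma = [rgb_to_ycbcr(r, g, b)[1:] for (_, _, _, r, g, b) in points]
    n = len(chroma)
    cb0 = sum(c[0] for c in chroma) / n  # chroma centroid
    cr0 = sum(c[1] for c in chroma) / n
    return [p for p, (cb, cr) in zip(points, chroma)
            if math.hypot(cb - cb0, cr - cr0) <= threshold]
```

In practice the threshold would be tuned per scene, and a combined filter would keep a point only if it passes the centroid-distance test in both color spaces.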

