Abstract

Point cloud classification is the basis for 3D spatial information extraction and its applications. Point-cluster-based methods have proven more efficient and accurate than point-based methods; however, classification precision is significantly affected by segmentation errors. Traditional single-scale point cloud segmentation methods cannot segment complex objects well in urban scenes, which results in inaccurate classification. In this paper, a new multi-scale point cloud segmentation method for urban-scene point cloud classification is proposed. The proposed method consists of two stages. In the first stage, to reduce the segmentation errors caused by density anisotropy and unreasonable neighborhoods, a multi-resolution supervoxel segmentation algorithm is proposed to segment objects into small-scale clusters. First, the point cloud is segmented into initial supervoxels based on geometric and quantitative constraints. Second, robust neighboring relationships between supervoxels are obtained based on a k-d tree and an octree. Furthermore, the resolution of supervoxels in planar and low-density regions is optimized. In the second stage, planar supervoxels are clustered into large-scale planar point clusters using a region growing algorithm. Finally, a mix of small-scale and large-scale point clusters is obtained for classification. The performance of the proposed segmentation method in classification was compared with that of other segmentation methods. Experimental results reveal that the proposed method improves both the efficiency and the accuracy of point cloud classification significantly compared with the other segmentation methods.
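The abstract does not give implementation details, but the second-stage idea, growing large planar clusters by region growing over a supervoxel adjacency graph, can be loosely illustrated as follows. This is a minimal sketch, not the authors' method: the function name, the planarity score, the thresholds, and the adjacency representation are all assumptions.

```python
import numpy as np
from collections import deque

def grow_planar_clusters(normals, adjacency, planarity,
                         angle_thresh_deg=10.0, planarity_thresh=0.9):
    """Cluster planar supervoxels into large planar regions by region growing.

    normals:    (N, 3) unit normal per supervoxel (hypothetical input)
    adjacency:  dict: supervoxel index -> iterable of neighbor indices
    planarity:  (N,) planarity score per supervoxel (e.g. from PCA eigenvalues)
    Returns an (N,) array of cluster labels; non-planar supervoxels keep -1.
    """
    n = len(normals)
    labels = -np.ones(n, dtype=int)
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    next_label = 0
    # Seed growth from the most planar supervoxels first.
    for seed in np.argsort(-planarity):
        if labels[seed] != -1 or planarity[seed] < planarity_thresh:
            continue
        labels[seed] = next_label
        queue = deque([seed])
        while queue:
            cur = queue.popleft()
            for nb in adjacency.get(cur, ()):
                # Grow into unlabeled, planar neighbors with a similar normal.
                if (labels[nb] == -1
                        and planarity[nb] >= planarity_thresh
                        and abs(np.dot(normals[cur], normals[nb])) >= cos_thresh):
                    labels[nb] = next_label
                    queue.append(nb)
        next_label += 1
    return labels
```

For example, a chain of three supervoxels sharing one normal merges into a single cluster, while a neighboring supervoxel with an orthogonal normal starts its own cluster.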
