Abstract

An efficient method based on plane segmentation and projection is proposed to extract boundary (straight) line segments as completely as possible. To detect boundary points efficiently, the 3-D planar point clouds (patches) generated by plane segmentation are converted into 2-D images by graphical projection, which avoids point-based nearest-neighbor searching. To preserve boundary details, the projection resolution of each plane is determined from statistics of the average local point spacing within the voxels composing the corresponding planar point cloud. To alleviate erroneous line segment extraction caused by uneven densities of the planar point clouds, an improved mathematical morphological operation, whose structuring element order adapts to the local point density, is proposed to inpaint cracks in the projected images. For the extraction of 3-D boundary line segments, the back-projections of the 2-D boundary line segments detected in the images serve as references for clustering boundary points, and a 3-D straight-line fitting algorithm based on recursive weighted least squares is employed to reduce the effect of boundary noise. A series of refinement strategies is further presented to control the quality of the extracted 3-D line segments. Experimental results demonstrate that the proposed method is automatic and yields efficient, fine results, satisfying the processing demands of large-scale point clouds.
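
The abstract mentions a 3-D straight-line fitting step based on recursive weighted least squares to suppress boundary noise. The following Python sketch illustrates one common way such a fit can be realized with iteratively reweighted least squares (weighted centroid and principal direction, with residual-driven reweighting); it is a generic illustration under assumed function names and parameters, not the paper's exact formulation.

```python
import numpy as np

def fit_line_3d_irls(points, n_iter=10, eps=1e-9):
    """Fit a 3-D straight line to a cluster of boundary points by
    iteratively reweighted least squares (illustrative sketch only;
    the paper's recursive weighting scheme may differ).

    points : (N, 3) array of boundary points of one candidate segment.
    Returns (centroid, direction): a point on the line and a unit direction.
    """
    pts = np.asarray(points, dtype=float)
    w = np.ones(len(pts))                      # start with uniform weights

    for _ in range(n_iter):
        # Weighted centroid and covariance of the point cluster
        c = np.average(pts, axis=0, weights=w)
        d = pts - c
        cov = (d * w[:, None]).T @ d / w.sum()

        # Line direction = principal eigenvector of the weighted covariance
        _, eigvec = np.linalg.eigh(cov)
        direction = eigvec[:, -1]

        # Orthogonal distance of each point to the current line estimate
        proj = d @ direction
        resid = np.linalg.norm(d - np.outer(proj, direction), axis=1)

        # Down-weight large residuals (Huber-style) to suppress boundary noise
        sigma = 1.4826 * np.median(resid) + eps
        w = np.minimum(1.0, sigma / (resid + eps))

    return c, direction


if __name__ == "__main__":
    # Synthetic boundary points along a line, with noise and a few outliers
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 200)
    line_pts = np.outer(t, [1.0, 2.0, 0.5]) + rng.normal(0, 0.01, (200, 3))
    line_pts[:5] += 0.5                        # simulated outliers
    p0, d0 = fit_line_3d_irls(line_pts)
    print("point on line:", p0, "direction:", d0)
```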
