Abstract
Object detection and recognition from LiDAR (Light Detection And Ranging) data has long been a research topic in photogrammetry and computer vision. Unlike point clouds collected in well-controlled indoor environments, point clouds of urban environments are far more complex, reflecting the complexity of the real world. For example, trees often stand close to signs or buildings, causing occlusions in the point clouds. Current object detection and reconstruction algorithms have difficulty recognizing objects with severe occlusions caused by trees and similar vegetation. In this paper, a robust vegetation removal method and a DBSCAN-based pole-like object detection method are proposed. Based on the observation that the major difference between vegetation and other, rigid objects is their penetrability with respect to LiDAR, we introduce a local roughness measure to differentiate rigid objects from non-rigid ones (vegetation in this paper). First, a local sphere with a small radius is generated around each input point. The three principal components of the points inside the sphere are then calculated, and a plane is fitted. The roughness is obtained as a weighted standard deviation of the normalized distances from all inside points to the plane: the farther a point lies from the plane, the smaller its weight. Finally, a graph-cuts based method is introduced to classify the input points into two groups. The data term is defined by the normalized roughness of the current point, and the smoothness term is defined by the normalized distance between the point and its nearest neighbour. For pole-like object detection, a uniform 2D grid is first generated by projecting all points onto the XY-plane. A seed point for each pole-like object is obtained by taking the x and y coordinates from the centre of the highest-density grid cell and the z coordinate from the mean height of the object's points. Finally, a DBSCAN-based method is introduced to collect the remaining points of each pole-like object. Experimental results show that the proposed vegetation removal method achieves state-of-the-art results on both mobile LiDAR and airborne LiDAR data, and the proposed pole-like object detection approach proves to be very efficient.
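The roughness measure can be made concrete with a short sketch. The following Python fragment is a minimal illustration under assumed parameters: the sphere radius and the exact weighting function are not specified here and are chosen for demonstration only. It fits a plane to each point's spherical neighbourhood by PCA and returns a weighted spread of point-to-plane distances.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_roughness(points, radius=0.3):
    """Roughness per point: weighted spread of neighbour distances to a
    locally fitted plane (small = rigid surface, large = vegetation).

    points : (N, 3) array of XYZ coordinates.
    radius : local sphere radius in the data's units (assumed value).
    """
    tree = cKDTree(points)
    roughness = np.zeros(len(points))
    for i, p in enumerate(points):
        nbrs = points[tree.query_ball_point(p, radius)]
        if len(nbrs) < 4:              # too few points to fit a plane
            roughness[i] = 0.0
            continue
        centred = nbrs - nbrs.mean(axis=0)
        # PCA: the singular vector of the smallest singular value is
        # the normal of the best-fit plane.
        _, _, vt = np.linalg.svd(centred, full_matrices=False)
        normal = vt[-1]
        d = np.abs(centred @ normal)            # point-to-plane distances
        d_norm = d / (d.max() + 1e-12)          # normalized distances
        w = np.exp(-d_norm)   # farther from the plane -> smaller weight
                              # (one simple decreasing choice, assumed here)
        mu = np.average(d_norm, weights=w)
        roughness[i] = np.sqrt(np.average((d_norm - mu) ** 2, weights=w))
    return roughness
```

Rigid surfaces such as walls or sign panels yield a roughness near zero, while foliage, which LiDAR pulses partially penetrate, yields a large spread.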
Highlights
With the development of laser scanning systems and increasing interest in three-dimensional city-scene understanding and reconstruction, more and more effort has been put into object detection and recognition from LiDAR data.
In this paper, a local roughness measure is proposed to describe this difference in point distribution; the points are then classified into two groups by graph cuts (see the sketch following these highlights), and the vegetation is removed from the LiDAR data.
Since mobile LiDAR data mainly capture street scenes, objects on the ground are usually interfered with by background vegetation points, which renders ineffective algorithms designed for indoor-scene object detection (Paul and Ramesh, 1988; Christopher and Benjamin, 2001) or for outdoor objects against a clean background.
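The two-group classification mentioned above can be posed as a binary labelling problem and solved with any min-cut/max-flow implementation. The sketch below uses the PyMaxflow library purely as a stand-in (the paper does not name its solver here); the data term follows the abstract (normalized roughness), the smoothness term links each point to its nearest neighbour, and the balance weight `lam` is an assumed parameter.

```python
import numpy as np
import maxflow                      # pip install PyMaxflow
from scipy.spatial import cKDTree

def segment_vegetation(points, roughness, lam=0.5):
    """Label points as vegetation (1) or rigid (0) via graph cuts.

    Data term: normalized roughness; smoothness term: normalized
    distance to the nearest neighbour (exact weights assumed here).
    """
    n = len(points)
    r = roughness / (roughness.max() + 1e-12)    # normalized roughness
    # Nearest-neighbour distances (k=2: the first hit is the point itself).
    dist, nn = cKDTree(points).query(points, k=2)
    d = dist[:, 1] / (dist[:, 1].max() + 1e-12)
    g = maxflow.Graph[float](n, n)
    nodes = g.add_nodes(n)
    for i in range(n):
        # Unary terms: high roughness makes the sink side (read here as
        # the vegetation label) cheap for this node.
        g.add_tedge(nodes[i], 1.0 - r[i], r[i])
        # Pairwise term: nearby points are encouraged to share a label.
        w = lam * (1.0 - d[i])
        g.add_edge(nodes[i], nodes[nn[i, 1]], w, w)
    g.maxflow()
    # get_segment: 0 = source side (rigid), 1 = sink side (vegetation).
    return np.array([g.get_segment(nodes[i]) for i in range(n)])
```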
Summary
With the development of laser scanning systems and increasing interest in three-dimensional city-scene understanding and reconstruction, more and more effort has been put into object detection and recognition from LiDAR data. Methods that classify points or objects using local geometric information may fail in vegetated scenes, leaving objects undetected or misrecognized. The neighbouring points of a given point on a penetrable object are distributed randomly inside the local spherical neighbourhood. Based on this idea, this paper proposes a local roughness measure to describe this difference in point distribution; the points are classified into two groups, and the vegetation is removed from the LiDAR data.
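The pole-detection stage can likewise be sketched in a few lines: project the cloud onto a uniform XY grid, seed each pole at the densest cell, and grow the pole with a density-based clustering pass. The fragment below uses scikit-learn's DBSCAN; the cell size, `eps`, and `min_samples` are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_pole_seed(points, cell=0.2):
    """Seed a pole-like object from the densest cell of a uniform XY grid.

    Returns (x, y, z): the centre of the highest-density cell in XY and
    the mean height of the points falling in that cell.
    """
    xy = points[:, :2]
    ij = np.floor((xy - xy.min(axis=0)) / cell).astype(int)
    # Count points per occupied cell and pick the densest one.
    cells, inverse, counts = np.unique(ij, axis=0, return_inverse=True,
                                       return_counts=True)
    best = counts.argmax()
    centre = xy.min(axis=0) + (cells[best] + 0.5) * cell
    z = points[inverse == best, 2].mean()
    return np.array([centre[0], centre[1], z])

def grow_pole(points, seed, eps=0.3, min_samples=10):
    """Collect the remaining pole points: run DBSCAN and keep the
    cluster containing the point closest to the seed."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    seed_label = labels[np.linalg.norm(points - seed, axis=1).argmin()]
    if seed_label == -1:                 # seed fell into DBSCAN noise
        return np.empty((0, 3))
    return points[labels == seed_label]
```

In practice the seed detection and the DBSCAN growth would be run after vegetation removal, so that foliage near a pole no longer inflates the density of neighbouring grid cells.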