Abstract

Building high-quality correspondences is critical in feature-based point cloud registration pipelines. However, existing single-sequence learning frameworks struggle to capture contextual information accurately and adequately, leaving a large proportion of outliers between two low-overlap scenes. In this paper, we present a progressive guidance network (PG-Net) to gather rich contextual information and exclude outliers. Specifically, we design a novel iterative structure that exploits the inlier probabilities of correspondences to progressively guide the classification of initial correspondences. This structure mitigates outlier effects with robust contextual information, yielding more accurate model estimation. In addition, to capture contextual information sufficiently, we propose a grouped dense fusion attention feature embedding module that enhances the representation of inliers and of significant channel-spatial features. Meanwhile, we propose a two-stage neural spectral matching module that computes the inlier probability of each correspondence and estimates a 3D transformation model in a coarse-to-fine manner. Experimental results on indoor and outdoor datasets using distinct 3D local descriptors demonstrate that our PG-Net surpasses state-of-the-art outlier removal methods. In particular, compared to the recent outlier removal network PointDSC, PG-Net improves registration recall by 4.06% on the indoor dataset with the FPFH descriptor. Source code: https://github.com/changcaiyang/PG-Net.
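To make the abstract's pipeline concrete, the sketch below illustrates the two classical building blocks it alludes to, not the learned network itself: spectral matching (Leordeanu and Hebert style) to score the inlier probability of each correspondence, followed by a weighted Kabsch fit refined coarse-to-fine by residual re-weighting. This is a minimal NumPy stand-in for PG-Net's neural spectral matching module; all function names, the kernel bandwidths `sigma` and `tau`, and the refinement loop are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def spectral_inlier_scores(src, dst, sigma=0.1, iters=50):
    """Classical spectral matching: the leading eigenvector of a
    pairwise length-compatibility matrix scores how consistent each
    correspondence (src[i] <-> dst[i]) is with all the others."""
    d_src = np.linalg.norm(src[:, None] - src[None, :], axis=-1)
    d_dst = np.linalg.norm(dst[:, None] - dst[None, :], axis=-1)
    # Rigid motion preserves lengths, so inlier pairs satisfy d_src ~= d_dst.
    M = np.exp(-((d_src - d_dst) ** 2) / (2.0 * sigma ** 2))
    np.fill_diagonal(M, 0.0)
    # Power iteration for the leading eigenvector (non-negative entries).
    v = np.ones(len(src))
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v) + 1e-12
    return v / (v.max() + 1e-12)  # normalized per-correspondence scores

def weighted_kabsch(src, dst, w):
    """Weighted Procrustes/Kabsch: closed-form rigid (R, t) minimizing
    the weighted squared residual over correspondences."""
    w = w / w.sum()
    mu_s, mu_d = w @ src, w @ dst
    H = (src - mu_s).T @ ((dst - mu_d) * w[:, None])
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T  # enforce det(R) = +1 (proper rotation)
    t = mu_d - R @ mu_s
    return R, t

def register(src, dst, rounds=3, tau=0.1):
    """Coarse-to-fine: spectral scores weight an initial fit, then
    correspondences with large residuals are softly down-weighted
    and the transform is re-estimated."""
    w = spectral_inlier_scores(src, dst)
    for _ in range(rounds):
        R, t = weighted_kabsch(src, dst, w)
        res = np.linalg.norm(src @ R.T + t - dst, axis=1)
        w = np.exp(-(res / tau) ** 2)  # residual-based re-weighting
    return R, t
```

PG-Net replaces the hand-crafted compatibility matrix and re-weighting above with learned feature embeddings and an iterative guidance structure, but the coarse score-then-refit loop conveys the coarse-to-fine estimation idea.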
