Abstract

Point cloud registration is a critical task in many 3D computer vision studies, aiming to find a rigid transformation that aligns one point cloud with another. In this paper, we propose PANet, a Point-Attention based multi-scale feature fusion network for partially overlapping point cloud registration. This study investigates whether multi-scale features are more effective at improving alignment precision than fixed-scale local features. PANet comprises two core components: a multi-branch feature extraction module that extracts local features at different scales in parallel, and a Point-Attention Module that learns an appropriate weight for each branch and then fuses the multi-scale features by weighted combination to enhance their representation ability. At the end of the network, four hidden layers regress the rigid transformation from the source point cloud to the template point cloud. Experiments on the synthetic ModelNet40 dataset demonstrate that PANet achieves state-of-the-art performance in terms of both alignment precision and robustness against noise. PANet also exhibits strong generalization ability on the real-world Stanford 3D and ICL-NUIM datasets. In addition, we evaluate the computational complexity of our model relative to previous works. The results and ablation studies demonstrate that multi-scale fused local features improve registration accuracy more than fixed-scale local features. The findings may inspire future research in related fields and contribute to the development of new ideas and approaches.
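To make the fusion step concrete, the following is a minimal PyTorch sketch of the weighted combination the abstract describes: per-branch attention weights are learned and used to fuse multi-scale point features. The branch pooling, the small scoring MLP, and all dimensions are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical sketch of Point-Attention style multi-scale fusion.
# Channel sizes, pooling choice, and the scoring MLP are assumptions
# made for illustration; they are not taken from the paper.
import torch
import torch.nn as nn


class PointAttentionFusion(nn.Module):
    """Learns a weight per scale branch and fuses the branch features."""

    def __init__(self, feat_dim: int):
        super().__init__()
        # Small MLP mapping a pooled branch descriptor to a scalar score.
        self.score = nn.Sequential(
            nn.Linear(feat_dim, feat_dim // 2),
            nn.ReLU(inplace=True),
            nn.Linear(feat_dim // 2, 1),
        )

    def forward(self, branch_feats):
        # branch_feats: list of (B, N, C) tensors, one per scale branch.
        stacked = torch.stack(branch_feats, dim=1)           # (B, K, N, C)
        pooled = stacked.max(dim=2).values                   # (B, K, C)
        weights = torch.softmax(self.score(pooled), dim=1)   # (B, K, 1)
        fused = (stacked * weights.unsqueeze(2)).sum(dim=1)  # (B, N, C)
        return fused


if __name__ == "__main__":
    B, N, C, K = 2, 1024, 64, 3
    feats = [torch.randn(B, N, C) for _ in range(K)]
    fusion = PointAttentionFusion(feat_dim=C)
    print(fusion(feats).shape)  # torch.Size([2, 1024, 64])
```

In this sketch the fused features would then feed the regression head (the four hidden layers mentioned above) that outputs the rigid transformation parameters.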
