Abstract

In this paper we propose a virtual-control-point-based method for the registration of photogrammetry and computed tomography (CT) data. Because the two data sources are fundamentally different, conventional registration methods, such as manual control-point registration or 3D local-feature-based registration, are not suitable. Our application targets 3D reconstructions of gyroscopes, which contain abundant geometric primitives that can be fitted in the point clouds. First, photogrammetry and CT scanning are applied, respectively, for the 3D reconstructions. Second, our workflow segments the surface point cloud obtained from the complete CT volumetric data. Geometric primitives are then fitted in this point cloud, benefitting from the less complex cluster segments. Next, intersection operations on the parametrized primitives generate virtual points, which are utilized as control points for estimating the transformation parameters. A random sample consensus (RANSAC) method finds the correspondences between the two virtual control point sets using corresponding descriptors and calculates the transformation matrix as an initial alignment for further refining the registration. Within our validation process, the workflow is invariant to pose, resolution, completeness and noise.
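The "intersection operations on the parametrized primitives" step can be illustrated with the simplest case: three fitted planes meeting in a single virtual point. The sketch below is our own illustration, not the paper's implementation; the function name and the plane parametrization (unit normal `n` and offset `d` with `n·x = d`) are assumptions.

```python
import numpy as np

def intersect_planes(planes):
    """Intersect three (or more) planes, each given as (normal, d)
    with the plane equation n . x = d.

    Stacking the equations gives a linear system N x = d; solving it
    in a least-squares sense yields the common point whenever the
    normals span 3D space (for >3 planes, the best-fit point).
    """
    normals = np.array([n for n, _ in planes], dtype=float)
    offsets = np.array([d for _, d in planes], dtype=float)
    point, *_ = np.linalg.lstsq(normals, offsets, rcond=None)
    return point

# Example: coordinate planes shifted to x = 1, y = 2, z = 3
planes = [((1, 0, 0), 1.0), ((0, 1, 0), 2.0), ((0, 0, 1), 3.0)]
virtual_point = intersect_planes(planes)
print(virtual_point)  # → [1. 2. 3.]
```

The same idea extends to other primitive pairs (e.g. a fitted line intersected with a fitted plane); the key property is that the virtual point is defined by the fitted parameters, not by any single measured point, which makes it robust to noise and resolution differences.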

Highlights

  • Sensor fusion is an important topic in many fields, because in real applications it is quite often difficult for a single sensor alone to provide the complete desired information

  • The object is chosen for its design rich in geometric primitives, while its rounded edges offer few evident corner points that could be manually picked for computed tomography (CT) and photogrammetry point cloud registration

  • Iterative closest point (ICP) is essentially an iterative loop that calculates the nearest neighbors between two point clouds, and usually good results can be obtained under good initial alignment conditions
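The ICP loop mentioned in the last highlight can be sketched in a few lines. This is a minimal, generic point-to-point ICP, not the paper's implementation; the brute-force nearest-neighbour search and the closed-form (Kabsch/SVD) transform update are standard choices assumed here for clarity.

```python
import numpy as np

def kabsch(P, Q):
    """Closed-form least-squares rigid transform with Q ~ P @ R.T + t."""
    mp, mq = P.mean(0), Q.mean(0)
    H = (P - mp).T @ (Q - mq)
    U, _, Vt = np.linalg.svd(H)
    # reflection guard: keep det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, mq - R @ mp

def icp(source, target, iters=20):
    """Minimal point-to-point ICP: alternate nearest-neighbour matching
    with a closed-form rigid-transform update (no convergence test)."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # brute-force nearest neighbour of every source point in the target
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        matched = target[d2.argmin(axis=1)]
        R, t = kabsch(src, matched)
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Toy check: a grid shifted by a small offset is recovered exactly,
# because every nearest neighbour is already the true correspondence.
g = np.linspace(0.0, 1.0, 5)
target = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
source = target + np.array([0.05, 0.0, 0.0])
R, t = icp(source, target)
```

The toy check also shows ICP's main weakness: with a large initial offset the nearest neighbours would be wrong and the loop would converge to a local minimum, which is exactly why the paper computes a coarse initial alignment first.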


Introduction

Sensor fusion is an important topic in many fields, because in real applications it is quite often difficult for a single sensor alone to provide the complete desired information. The combination of photogrammetry and CT has been discussed frequently in the medical field (Bolandzadeh et al, 2013), but has not yet been applied often in TH digitization applications (Zhan et al, 2020). This paper mainly discusses a new registration method for photogrammetric and CT point clouds in such TH applications, where complete models are required. A point cloud is chosen as the common representation for photogrammetric surface data and CT volumetric data, because point cloud registration is an ongoing topic in the fields of photogrammetry and computer vision, with various methods having been put forward. The most frequently applied method is iterative closest point (ICP) registration (Besl and McKay, 1992) or its variants, which iteratively calculates the discrepancy of the overlap between two point clouds. Methods based on automatic 3D feature extraction, such as the fast point feature histogram (FPFH) (Rusu et al, 2009), or nor-
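The RANSAC step named in the abstract, estimating a rigid transform from putative correspondences that may contain outliers, can be sketched as follows. This is a generic illustration under assumed inputs (index pairs `matches` between point sets `P` and `Q`, e.g. from descriptor matching), not the paper's pipeline; function names and parameters are our own.

```python
import numpy as np

def kabsch(P, Q):
    """Closed-form least-squares rigid transform with Q ~ P @ R.T + t."""
    mp, mq = P.mean(0), Q.mean(0)
    H = (P - mp).T @ (Q - mq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, mq - R @ mp

def ransac_rigid(P, Q, matches, iters=300, thresh=0.05, seed=0):
    """RANSAC over putative matches: sample 3 pairs, fit a rigid
    transform, count inliers, then refit on the best inlier set."""
    rng = np.random.default_rng(seed)
    matches = np.asarray(matches)
    best = np.zeros(0, dtype=int)
    for _ in range(iters):
        pick = rng.choice(len(matches), size=3, replace=False)
        R, t = kabsch(P[matches[pick, 0]], Q[matches[pick, 1]])
        resid = np.linalg.norm(
            P[matches[:, 0]] @ R.T + t - Q[matches[:, 1]], axis=1)
        inliers = np.flatnonzero(resid < thresh)
        if len(inliers) > len(best):
            best = inliers
    return kabsch(P[matches[best, 0]], Q[matches[best, 1]])

# Toy check: 20 correct matches plus 5 bogus ones; the bogus pairs are
# rejected as outliers and the true transform is recovered.
g = np.linspace(0.0, 1.0, 3)
P = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T[:20]
a = np.pi / 6
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([0.3, -0.2, 0.1])
Q = P @ R_true.T + t_true
matches = [(k, k) for k in range(20)] + [(k, (k + 7) % 20) for k in range(5)]
R_est, t_est = ransac_rigid(P, Q, matches)
```

The resulting transform only needs to be roughly correct: it serves as the initial alignment that a subsequent ICP refinement then tightens.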

