Abstract

This paper presents a rapid method for automatically aligning large sets of 3D scanned data. The method incorporates 2D image registration into the pair-wise registration of range images. First, it retrieves two texture images from the scanned data to be aligned; for scanned data that contain no texture information, a method is proposed to generate a texture image directly from the range image. Second, it detects interest points on the texture images using SIFT (scale-invariant feature transform), and a candidate set of corresponding pixel pairs is selected by pre-filtering and cross validation. A matching algorithm based on RANSAC (random sample consensus) then identifies the correct pixel matches, and their corresponding vertex pairs, subject to 3D geometric constraints. These matched vertices are used to estimate the rigid transformation between the two range images. Finally, a modified ICP (iterative closest point) algorithm refines the result. For multi-view registration, the paper also presents a method to construct the model graph rapidly, which avoids aligning every pair of range images and speeds up registration. The technique has been successfully applied to the realistic 3D reconstruction of a variety of cultural heritage objects.
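The two geometric steps named in the abstract — estimating a rigid transformation from matched 3D vertex pairs, then refining it with ICP — can be illustrated with a minimal NumPy sketch. This is an assumed, simplified rendering, not the authors' implementation: it uses the standard SVD (Kabsch) solution for the rigid transform and a plain point-to-point ICP loop, omitting the SIFT/RANSAC matching stage and whatever modifications the paper's ICP variant makes.

```python
import numpy as np

def rigid_transform(P, Q):
    """Estimate rotation R and translation t minimizing sum ||R @ P_i + t - Q_i||^2
    over matched 3D point pairs (Kabsch solution via SVD; rotation only, no scale)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance of centred pairs
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def icp(src, dst, iters=20):
    """Plain point-to-point ICP: repeatedly match each source point to its nearest
    destination point, then re-estimate the rigid transform from those matches."""
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = src @ R.T + t
        # brute-force nearest neighbours; a k-d tree would be used in practice
        d2 = ((moved[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matches = dst[d2.argmin(axis=1)]
        R, t = rigid_transform(src, matches)
    return R, t
```

Given the correct vertex pairs produced by the RANSAC stage, `rigid_transform` alone recovers the exact rigid motion; ICP is only needed afterwards, to refine the alignment against the full point sets rather than the sparse feature matches.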
