Abstract

One of the main challenges in non-rigid surface matching is to match complex surfaces in the absence of salient landmarks (marker-less) and salient structures (structure-less). We propose an accurate non-rigid surface registration method, called DSMM, that matches complex surfaces through dense point-to-point correspondence alignment. The key idea of our approach is to model the correspondences between surfaces using a Student's-t mixture model and to represent local spatial structure via the Dirichlet distribution and directional springs. First, we formulate the alignment of two point sets as a probability density estimation problem, modeling one set as the Student's-t mixture model centroids and the other as the observed data. We subsequently incorporate spatial representations of the vertices on the surfaces into the prior probability of the finite Student's-t mixture model by exploiting the Dirichlet distribution and Dirichlet law. We then add an explicit structure regularization to obtain an approximately isometric and near-conformal transformation. Finally, we derive closed-form solutions for the registration parameters within an Expectation-Maximization (EM) framework, leading to a computationally efficient registration method. We compare DSMM with other state-of-the-art direct point-based non-rigid surface matching methods based on finite mixture models, on artificial shapes with large deformations and on real complex shapes from various segmented brain structures. DSMM demonstrates its statistical accuracy and robustness, outperforming the competing methods.
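
For concreteness, the probabilistic formulation summarized above can be sketched, in generic notation chosen here purely for illustration (the centroids \mathbf{y}_m, displacement field v, mixing weights \pi_{nm}, scale \sigma^2, and degrees of freedom \nu are not quoted from the paper), as a Student's-t mixture over the observed points with Dirichlet-distributed correspondence priors:

p(\mathbf{x}_n \mid \Theta) \;=\; \sum_{m=1}^{M} \pi_{nm}\, \mathrm{St}\!\bigl(\mathbf{x}_n \,\big|\, \mathbf{y}_m + v(\mathbf{y}_m),\, \sigma^2,\, \nu\bigr),
\qquad (\pi_{n1}, \dots, \pi_{nM}) \sim \mathrm{Dir}(\boldsymbol{\alpha}_n).

Under a formulation of this kind, EM alternates between computing posterior correspondence probabilities (E-step) and updating the transformation and mixture parameters in closed form (M-step).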
