Abstract

Active shape models (ASMs) are widely used for image segmentation. Building an ASM requires determining point correspondences across the training data, which typically yields a set of landmarks distributed according to the statistical shape variations. State-of-the-art methods solve this problem by minimizing the description length of all landmarks using a parametric mapping of the target shape (e.g. onto a sphere). For models composed of multiple sub-parts or highly non-convex shapes, these techniques have substantial drawbacks. This article proposes a novel technique that solves the crucial correspondence problem using non-rigid image registration. Unlike existing approaches, the new method yields more detailed ASMs and does not require an explicit or parametric formulation of the problem. Moreover, compared to other methods, an already built ASM can be updated very efficiently with additional prior knowledge. For this work, a training set of 3-D kidney pairs was manually segmented from 41 CT images of different patients and forms the basis of a clinical evaluation. The registration-based approach is compared to an established algorithm that uses a minimum description length (MDL) formulation. The results indicate that using non-rigid image registration to solve the point correspondence problem leads to improved ASMs and more accurate segmentation results. Sensitivity increased by approximately 10%, and experiments analyzing the dependency on user initialization show a sensitivity gain of 5–15%. The mean squared error between the segmentation results and the manually segmented ground truth was also reduced by 20–34% across varying numbers of training samples.
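
To make the registration-based correspondence idea concrete, the following is a minimal sketch, not the authors' implementation: landmarks placed on a reference shape are propagated to each training shape via a non-rigid transform obtained from image registration, and a point distribution model is then built by PCA over the corresponded landmark vectors. The registration backend is left abstract (each transform is assumed to be a callable mapping a 3-D point from the reference space to a training image's space); only NumPy is used.

```python
import numpy as np

def propagate_landmarks(ref_landmarks, transform):
    """Map reference landmarks (N x 3) into a training shape's space.

    `transform` is a hypothetical callable point -> point produced by any
    non-rigid registration method (e.g. a B-spline deformation field).
    """
    return np.asarray([transform(p) for p in ref_landmarks])

def build_point_distribution_model(landmark_sets):
    """PCA over corresponded landmark sets (each N x 3).

    Returns the mean shape vector, the modes of variation (rows), and the
    per-mode variances, i.e. the usual ASM point distribution model.
    """
    X = np.stack([s.ravel() for s in landmark_sets])      # (num_shapes, 3N)
    mean_shape = X.mean(axis=0)
    _, S, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
    variances = S ** 2 / (X.shape[0] - 1)                 # eigenvalues of covariance
    return mean_shape, Vt, variances

# Hypothetical usage, assuming `transforms[i]` registers the reference image
# to training image i and `ref_lm` holds the reference landmarks:
# corresponded = [propagate_landmarks(ref_lm, T) for T in transforms]
# mean_shape, modes, variances = build_point_distribution_model(corresponded)
```

In this sketch the correspondence problem is solved implicitly by the deformation fields, so no explicit parametric mapping of the shape surface (as in MDL-based approaches) is needed; adding a new training sample only requires one additional registration and a PCA update.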
