Abstract
Three-dimensional active shape models use a set of annotated volumes to learn a shape model. Using unique landmarks to define the surface models in the training set, the shape model learns the expected shape and variation modes of the segmentation. This information is then used during the segmentation process to impose shape constraints. A relevant problem in which these models are used is the segmentation of the left ventricle (LV) in 3D MRI volumes. In this problem, the annotations correspond to a set of contours that define the LV border at each volume slice. However, each volume has a different number of slices (and thus a different number of landmarks), which makes model learning difficult. Furthermore, motion artifacts and the large distance between slices make interpolation of voxel intensities a poor choice when applying the learned model to a test volume. These two problems raise the following questions: (1) how can we learn a shape model from volumes with a variable number of slices? and (2) how can we segment a test volume without interpolating voxel intensities between slices? This paper answers these questions by proposing a framework to deal with the variable number of slices in the training set and a resampling strategy for the test phase, allowing segmentation of the left ventricle in cardiac MRI volumes with any number of slices. The proposed method was evaluated on a public database with 660 volumes of both healthy and diseased patients, with promising results.
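As a rough illustration of question (1), a common way to reconcile training shapes with different slice counts is to resample each stack of per-slice contours to a fixed number of slices before learning a classic point-distribution model via PCA. The sketch below is a hypothetical, simplified version of that idea using NumPy; the function names, the linear resampling scheme, and the choice of PCA are assumptions for illustration, not the paper's actual framework.

```python
# Hypothetical sketch: resample per-slice contours to a fixed slice count,
# then learn a PCA point-distribution model (classic active shape model setup).
# Assumes shapes are already aligned; alignment (e.g. Procrustes) is omitted.
import numpy as np

def resample_slices(contours, n_slices):
    """Linearly interpolate a stack of per-slice contours, each an
    (n_points, 2) array, to a fixed number of slices along the long axis."""
    contours = np.asarray(contours, dtype=float)   # (k, n_points, 2)
    k = contours.shape[0]
    src = np.linspace(0.0, 1.0, k)                 # original slice positions
    dst = np.linspace(0.0, 1.0, n_slices)          # target slice positions
    out = np.empty((n_slices,) + contours.shape[1:])
    for p in range(contours.shape[1]):             # interpolate each landmark
        for d in range(2):                         # x and y coordinates
            out[:, p, d] = np.interp(dst, src, contours[:, p, d])
    return out

def learn_shape_model(volumes, n_slices, n_modes):
    """Stack resampled shapes into vectors and extract variation modes
    with PCA (via SVD of the centered data matrix)."""
    X = np.stack([resample_slices(v, n_slices).ravel() for v in volumes])
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = Vt[:n_modes]                  # principal variation modes
    var = (S ** 2) / (len(X) - 1)         # variance captured per mode
    return mean, modes, var[:n_modes]
```

With such a model, a new shape can be approximated as the mean plus a small number of mode coefficients, which is what lets an active shape model impose shape constraints during segmentation.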