Abstract

Model-based segmentation (MBS) has been successfully used for the fully automatic segmentation of anatomical structures in medical images with well-defined gray values due to its ability to incorporate prior knowledge about the organ shape. However, the robust and accurate detection of the boundary points required for MBS is still a challenge for organs with inhomogeneous appearance, such as the prostate, and for magnetic resonance (MR) images, where the image contrast can vary greatly due to the use of different acquisition protocols and scanners at different clinical sites. In this paper, we propose a novel boundary detection approach and apply it to the segmentation of the whole prostate in MR images. We formulate boundary detection as a regression task, where a convolutional neural network is trained to predict the distances between a surface mesh and the corresponding boundary points. We have evaluated our method on the Prostate MR Image Segmentation 2012 challenge data set, with the results showing that the new boundary detection approach detects boundaries more robustly with respect to contrast and appearance variations, and more accurately, than previously used features. With an average boundary distance of 1.71 mm and a Dice similarity coefficient of 90.5%, our method was able to segment the prostate more accurately on average than a second human observer and placed first out of the 40 entries submitted to the challenge at the time of writing.
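
To make the regression formulation concrete, below is a minimal sketch (not the authors' implementation) of how a CNN can predict the signed distance from a mesh-surface point to the true organ boundary: a small network maps an image patch sampled around each mesh point to a scalar distance, trained with an L2 loss against ground-truth mesh-to-boundary distances. The architecture, patch size, and training details are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn

class BoundaryDistanceCNN(nn.Module):
    """Predicts a scalar signed distance (in mm) from a 2D image patch."""
    def __init__(self, patch_size: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        flat = 32 * (patch_size // 4) ** 2  # channels * spatial size after pooling
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(flat, 64), nn.ReLU(),
            nn.Linear(64, 1),  # signed distance to the boundary (mm)
        )

    def forward(self, patch: torch.Tensor) -> torch.Tensor:
        return self.regressor(self.features(patch))

# One training step under these assumptions: regress predicted distances
# onto ground-truth distances between mesh points and boundary points.
model = BoundaryDistanceCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

patches = torch.randn(8, 1, 32, 32)  # patches sampled at mesh-surface points
target_dist = torch.randn(8, 1)      # true signed distances (placeholder data)

optimizer.zero_grad()
loss = loss_fn(model(patches), target_dist)
loss.backward()
optimizer.step()

At inference time, the predicted distance for each mesh point would displace that point along its surface normal toward the detected boundary, after which the shape model constrains the overall mesh deformation.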
