Abstract

Segmentation plays an important role in medical imaging: a precise segmentation can significantly improve the accuracy of object detection and localization. Level-set-based models are robust in image segmentation, but the parameters of the level set function are usually chosen empirically, which discourages their application in the medical field, because medical images vary widely and users may not be familiar with parameter tuning for level set methods. In this paper, we present an automatic segmentation method based on a variational level set formulation. The method is formulated with statistical measures and solved using the Euler-Lagrange equation. Its segmentation criteria rely on the structural similarities of the image, namely luminance, contrast, and correlation coefficients. These criteria are combined into an energy function that maximizes the structural difference between object and background during segmentation. The energy function is solved and implemented using the variational level set method. Unlike prevalent level set methods, the segmentation parameters of our approach are decided automatically from the structural information of the image and updated during iteration, so the model is nonparametric. Moreover, our approach requires no training and makes no a priori assumptions about the probability density functions used in statistical inference. Furthermore, the method is region-based and does not use gradients, and its parameters are updated from image information, so it significantly reduces the computational cost of the numerical implementation. The segmentation results show that our method adequately captures the structural differences between object and background during segmentation.
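To make the idea of an energy built from structural measures concrete, the sketch below computes an SSIM-style luminance and contrast comparison between the region inside a segmentation mask (object) and the region outside it (background), and scores a segmentation by how structurally dissimilar the two regions are. This is a hypothetical illustration under assumed definitions; the paper's exact energy functional, its correlation term, and its level set evolution are not reproduced here, and the function name `structural_difference` is my own.

```python
import numpy as np

def structural_difference(img, mask, eps=1e-8):
    """Score a segmentation by the SSIM-style structural dissimilarity
    between the pixels inside `mask` (object) and outside (background).

    Illustrative sketch only: the paper's actual energy functional is
    solved via the Euler-Lagrange equation of a level set formulation,
    which is not implemented here.
    """
    obj = img[mask]
    bg = img[~mask]
    mu1, mu2 = obj.mean(), bg.mean()      # luminance statistics
    s1, s2 = obj.std(), bg.std()          # contrast statistics
    # SSIM-style similarity terms (close to 1 when the regions look alike)
    lum = (2 * mu1 * mu2 + eps) / (mu1**2 + mu2**2 + eps)
    con = (2 * s1 * s2 + eps) / (s1**2 + s2**2 + eps)
    # Energy to maximize: structural dissimilarity between the regions
    return 1.0 - lum * con

# Toy image: bright square (object) on a dark background
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
good_mask = img > 0.5                     # mask matching the square
bad_mask = np.zeros_like(good_mask)
bad_mask[:, :16] = True                   # arbitrary half-image split

print(structural_difference(img, good_mask))  # near 1: regions differ
print(structural_difference(img, bad_mask))   # near 0: halves look alike
```

A curve evolution driven by such a term would iteratively deform the mask to increase this score, which is the sense in which the method "maximizes the structural difference between object and background."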
