Abstract

Image segmentation is a fundamental step in many applications such as image editing and medical image analysis and processing. Graph cuts have become a common tool for image segmentation in recent years. Tang, Gorelick, Veksler, and Boykov proposed an image segmentation model that combines an appearance-overlap term on unnormalized histograms with the graph-cut framework. Their model is highly effective for interactive segmentation but is prone to producing isolated points. To avoid this problem, we propose an effective interactive image segmentation method that incorporates geodesic distance information, appearance overlap information, and edge information into the well-known graph-cut framework. Rather than taking a simple union of these cues, our method adaptively tunes the relative strength of each term. We use the user's scribbles to estimate foreground/background color models via fast kernel density estimation, and then derive the appearance overlap term from the inferred color models. By exploiting both the geodesic distance and the global appearance-overlap color cues, our method requires less user effort and achieves higher segmentation accuracy than recent interactive segmentation techniques such as Geodesic Graph Cut, GrabCut in One Cut, Semi-Supervised Normalized Cuts, and Convexity Shape Prior for Binary Segmentation, as shown in our experiments.
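
The sketch below is not the authors' implementation; it is a minimal illustration, under simplifying assumptions, of one step the abstract describes: estimating foreground/background color models from user scribbles with kernel density estimation and converting them into per-pixel unary costs suitable for a graph-cut style energy. The bandwidth value, function names, and the use of standard (rather than fast) KDE via scikit-learn are illustrative choices, not details taken from the paper.

```python
# Hypothetical sketch: KDE color models from scribbles -> unary costs.
# Not the paper's method; names, bandwidth, and library choice are assumptions.
import numpy as np
from sklearn.neighbors import KernelDensity


def color_models_from_scribbles(image, fg_mask, bg_mask, bandwidth=0.05):
    """Fit Gaussian KDE color models on scribbled pixels.

    image   : H x W x 3 float array with values in [0, 1]
    fg_mask : H x W boolean array, True where the user scribbled foreground
    bg_mask : H x W boolean array, True where the user scribbled background
    """
    fg_pixels = image[fg_mask]          # (n_fg, 3) colors under foreground scribbles
    bg_pixels = image[bg_mask]          # (n_bg, 3) colors under background scribbles
    fg_kde = KernelDensity(bandwidth=bandwidth).fit(fg_pixels)
    bg_kde = KernelDensity(bandwidth=bandwidth).fit(bg_pixels)
    return fg_kde, bg_kde


def unary_costs(image, fg_kde, bg_kde):
    """Negative log-likelihood cost of labeling each pixel foreground/background."""
    h, w, _ = image.shape
    flat = image.reshape(-1, 3)
    cost_fg = -fg_kde.score_samples(flat).reshape(h, w)  # low where the fg model fits well
    cost_bg = -bg_kde.score_samples(flat).reshape(h, w)
    return cost_fg, cost_bg


if __name__ == "__main__":
    # Toy data standing in for an image and user scribbles.
    rng = np.random.default_rng(0)
    img = rng.random((64, 64, 3))
    fg = np.zeros((64, 64), dtype=bool); fg[20:30, 20:30] = True
    bg = np.zeros((64, 64), dtype=bool); bg[:5, :] = True
    fg_kde, bg_kde = color_models_from_scribbles(img, fg, bg)
    cost_fg, cost_bg = unary_costs(img, fg_kde, bg_kde)
    print(cost_fg.shape, cost_bg.shape)  # (64, 64) (64, 64)
```

In the full method, such unary costs would be blended with a geodesic-distance term and an edge-sensitive pairwise term, with adaptively tuned weights, and the resulting energy minimized with a standard s-t min-cut solver; that machinery is omitted from this sketch.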
