Abstract

Active contours are a popular technique for image segmentation. However, an active contour tends to converge to the closest local minimum of its energy function and often requires an initialization close to the target boundary. We introduce a new approach that overcomes this close-initialization problem by reformulating the external energy term. We treat the active contour as the mean curve of a probability density function p(x), which moves to minimize the Kullback-Leibler (KL) divergence between p(x) and a probability density function derived from the image. The KL divergence forces p(x) to cover all image areas; uncovered areas are heavily penalized, which allows the active contour to move across edges. We also apply deterministic annealing to the width of p(x) to implement a coarse-to-fine search strategy. In the limit, as the width of p(x) goes to zero, the KL divergence converges to the conventional external energy term of active contours, which can thus be seen as a special case. Our method produces robust segmentation results from arbitrary initialization positions.
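The coverage-penalizing external energy described above can be sketched numerically. The sketch below is illustrative, not the paper's exact formulation: it assumes p(x) is a mixture of isotropic Gaussians centered at the contour points (with width sigma, the annealing parameter) and that the image-derived density is any nonnegative map normalized over the pixel grid; the function names are invented for the example.

```python
import numpy as np

def contour_density(points, grid, sigma):
    """p(x): mixture of isotropic Gaussians centered at the contour points.

    points: (N, 2) contour point coordinates
    grid:   (M, 2) pixel coordinates the density is evaluated on
    sigma:  width of p(x); large sigma = coarse search, small = fine
    """
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(-1)  # (M, N)
    p = np.exp(-d2 / (2.0 * sigma ** 2)).sum(axis=1)
    return p / p.sum()  # normalize to a probability density on the grid

def kl_divergence(q, p, eps=1e-12):
    """KL(q || p): image regions with mass in q but not covered by p
    contribute large log-ratio terms, i.e. uncovered areas are heavily
    penalized, pulling the contour across intervening edges."""
    return float(np.sum(q * np.log((q + eps) / (p + eps))))

# Toy setup: image-derived density q concentrated at (5, 5),
# contour points initialized elsewhere.
grid = np.stack(np.meshgrid(np.arange(10.0), np.arange(10.0)), -1).reshape(-1, 2)
q = contour_density(np.array([[5.0, 5.0]]), grid, sigma=1.0)
pts = np.array([[2.0, 2.0], [7.0, 7.0]])

# Deterministic annealing: at a large sigma, p(x) spreads out and covers
# q's mass, so the energy landscape is smooth; shrinking sigma sharpens it.
kl_fine = kl_divergence(q, contour_density(pts, grid, sigma=1.0))
kl_coarse = kl_divergence(q, contour_density(pts, grid, sigma=3.0))
```

Here `kl_coarse < kl_fine` for a distant initialization, which is why annealing sigma from large to small lets the contour first find the right region, then refine the boundary.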
