Abstract
The active contour model is a widely used technique for automatic object contour extraction. Existing methods based on this model can achieve high accuracy, even for complex contours, but challenging issues remain, such as the need for precise contour initialization around high-curvature boundary segments and the handling of cluttered backgrounds. To address these issues, this paper presents a salient object extraction method, the first step of which is the introduction of an improved edge map that incorporates edge direction as a feature. Direction information is extracted in small neighborhoods of image feature points, and the image's prominent orientations are defined for direction-selective edge extraction. Using this improved edge information, we provide a highly accurate shape contour representation, which we also combine with texture features. The principle of the paper is to interpret an object as the fusion of its components: its extracted contour and its inner texture. The fused textural and structural information serves two purposes: it drives automatic contour initialization, and it establishes an improved external force field. This fusion then yields highly accurate salient object extractions. Extensive evaluations confirm that the presented object extraction method outperforms parametric active contour models and achieves higher efficiency than the majority of the evaluated automatic saliency methods.
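The abstract does not give the exact formulation of the direction-selective edge map, so the following is only a minimal illustrative sketch, not the authors' method. It assumes a simple gradient-orientation histogram: dominant orientations are taken as the strongest histogram bins, and edge pixels are kept only if their gradient orientation lies near one of them. The function name `direction_selective_edges` and the parameters `n_bins`, `n_dominant`, `tol_deg`, and `mag_thresh` are hypothetical.

```python
import numpy as np


def direction_selective_edges(image, n_bins=36, n_dominant=2,
                              tol_deg=15.0, mag_thresh=0.1):
    """Sketch of direction-selective edge extraction (assumed formulation):
    keep only edge pixels whose gradient orientation lies close to one of
    the image's dominant orientations."""
    # Gradient components via finite differences (the paper's operator is not specified).
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # orientation, direction-agnostic

    # Candidate edge pixels: sufficiently strong gradient magnitude.
    strong = mag > mag_thresh * mag.max()

    # Magnitude-weighted orientation histogram over candidate edge pixels.
    hist, bin_edges = np.histogram(ang[strong], bins=n_bins,
                                   range=(0.0, 180.0), weights=mag[strong])
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])

    # The most prominent orientations of the image (strongest histogram bins).
    dominant = centers[np.argsort(hist)[::-1][:n_dominant]]

    # Keep pixels whose orientation is within tol_deg of a dominant orientation,
    # using circular distance on the 0-180 degree orientation axis.
    diff = np.abs(ang[..., None] - dominant[None, None, :])
    diff = np.minimum(diff, 180.0 - diff)
    selected = strong & (diff.min(axis=-1) <= tol_deg)

    # Improved edge map: gradient magnitude kept only at selected pixels.
    edge_map = np.where(selected, mag, 0.0)
    return edge_map, dominant
```

In an active-contour pipeline of the kind the abstract describes, such an edge map could replace the standard gradient-magnitude edge map when deriving the external force field, suppressing clutter edges whose orientation disagrees with the object's prominent directions.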