Abstract

Precise and efficient object boundary detection is key to the success of many imaging applications involving object segmentation or recognition. Blur-scale at a given image location represents the transition width of the local object interface. Hence, knowledge of blur-scale is crucial for accurate edge detection and object segmentation. In this paper, we present new theory and algorithms for computing local blur-scales and apply them to scale-based gradient computation and edge detection. The new blur-scale computation method is based on our observation that gradients inside a blur-scale region follow a Gaussian distribution with non-zero mean. New statistical criteria using maximum likelihood functions are established and applied to local blur-scale computation. Gradient vectors over a blur-scale region are summed to enhance gradients at blurred object interfaces while leaving gradients at sharp transitions unaffected. Finally, a blur-scale-based non-maxima suppression method is developed for edge detection. The method has been applied to both natural and phantom images. Experimental results show that the computed blur-scales capture the true blur extents at individual image locations, and that the new scale-based gradient computation and edge detection algorithms successfully detect gradients and edges, especially at blurred object interfaces.
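The two computational steps named in the abstract, summing gradient vectors over a blur-scale region and applying non-maxima suppression to the resulting gradient field, can be sketched in NumPy. This is a minimal illustration, not the paper's method: a fixed, user-chosen `scale` stands in for the locally estimated blur-scale, and a standard 4-direction non-maxima suppression (as in Canny-style edge detectors) stands in for the paper's blur-scale-based variant.

```python
import numpy as np

def _box_sum(a, r):
    # Zero-padded (2r+1) x (2r+1) box sum around every pixel.
    p = np.pad(a, r)
    h, w = a.shape
    out = np.zeros((h, w))
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += p[dy:dy + h, dx:dx + w]
    return out

def summed_gradients(img, scale=2):
    """Sum gradient vectors over a (2*scale+1)-wide neighbourhood.

    `scale` is a hypothetical fixed stand-in for the locally computed
    blur-scale.  Summing vectors (not magnitudes) boosts the response at
    wide, blurred edges, while sharp edges, whose gradient support is
    already narrow, are largely unaffected.
    """
    gy, gx = np.gradient(img.astype(float))
    return _box_sum(gx, scale), _box_sum(gy, scale)

def non_maxima_suppression(sx, sy):
    """Keep a pixel only if its gradient magnitude is a local maximum
    along the (4-quantised) gradient direction."""
    mag = np.hypot(sx, sy)
    ang = (np.rad2deg(np.arctan2(sy, sx)) + 180.0) % 180.0
    out = np.zeros_like(mag)
    h, w = mag.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = ang[y, x]
            if a < 22.5 or a >= 157.5:   # ~horizontal gradient
                n1, n2 = mag[y, x - 1], mag[y, x + 1]
            elif a < 67.5:               # ~45-degree gradient
                n1, n2 = mag[y - 1, x + 1], mag[y + 1, x - 1]
            elif a < 112.5:              # ~vertical gradient
                n1, n2 = mag[y - 1, x], mag[y + 1, x]
            else:                        # ~135-degree gradient
                n1, n2 = mag[y - 1, x - 1], mag[y + 1, x + 1]
            if mag[y, x] >= n1 and mag[y, x] >= n2:
                out[y, x] = mag[y, x]
    return out
```

For example, on a synthetic image with a vertical step edge, `summed_gradients` produces a strong horizontal gradient band around the edge, and `non_maxima_suppression` thins that band back toward the edge locus.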
