Abstract
Many digital images contain blurred regions caused by motion or defocus. Defocus blur reduces the contrast and sharpness of image detail. Automatic blur detection and segmentation is an important and challenging task in computer vision (e.g., object recognition and scene interpretation), where large amounts of data must be extracted from the sharp areas of an image. The sharp and blurred areas must therefore be segmented separately to ensure that information is extracted only from the sharp regions. Existing blur detection and segmentation techniques expend considerable effort and time designing metric maps of local clarity, and they suffer from several limitations: low accuracy on noisy images, difficulty distinguishing blurred-smooth from sharp-smooth regions, and high execution cost. There is therefore a pressing need for a defocus blur detection and segmentation method that is robust to these limitations. In this paper, we present a novel defocus blur detection and segmentation algorithm, Local Triplicate Co-occurrence Patterns (LTCoP), for separating in-focus and out-of-focus regions. We observe that fusing the extracted higher and lower patterns of LTCoP produces far better results than either alone. To evaluate its effectiveness, the proposed method is compared with several state-of-the-art techniques on a large set of sample images. The experimental results show that the proposed technique achieves results comparable to state-of-the-art methods while offering a significant speed advantage over them. We therefore argue that the proposed method can reliably be used for defocus blur detection and segmentation in high-density images.
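The abstract does not spell out how LTCoP is computed, so the following is only a minimal sketch of the idea it describes: split a ternary local pattern into a "higher" (upper) and a "lower" map, fuse their activity, and threshold the result into in-focus and out-of-focus regions. The function names `ternary_maps` and `blur_segmentation`, the parameters `t` (ternary tolerance) and `win` (averaging window), and the global-mean threshold are all hypothetical stand-ins, not the authors' actual method.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def ternary_maps(gray, t=5.0):
    """Upper/lower 8-bit pattern codes over the 8-neighbourhood (LTP-style).

    NOTE: illustrative approximation only; the paper's LTCoP descriptor
    is not defined in the abstract.
    """
    h, w = gray.shape
    c = gray[1:-1, 1:-1]                          # centre pixels (borders dropped)
    upper = np.zeros(c.shape, np.uint8)
    lower = np.zeros(c.shape, np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]  # 8 neighbours, clockwise
    for bit, (dy, dx) in enumerate(offsets):
        n = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        upper |= (n > c + t).astype(np.uint8) << bit  # +1 ternary states
        lower |= (n < c - t).astype(np.uint8) << bit  # -1 ternary states
    return upper, lower


def blur_segmentation(gray, t=5.0, win=15):
    """Fuse higher/lower pattern activity and threshold into a sharpness mask."""
    upper, lower = ternary_maps(gray.astype(np.float64), t)

    def popcount(m):
        return np.unpackbits(m[..., None], axis=-1).sum(axis=-1)

    # Sharp texture flips many ternary states per pixel; defocused regions few.
    activity = (popcount(upper) + popcount(lower)).astype(np.float64)
    score = uniform_filter(activity, size=win)    # local mean pattern activity
    # Global-mean threshold for illustration; Otsu or similar would be typical.
    return score >= score.mean()                  # True = in-focus, False = blurred
```

Given a grayscale image loaded as a 2-D float array, `blur_segmentation(img)` returns a boolean mask over the image interior marking the regions the fused pattern activity deems in-focus.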