Abstract
Defocus blur adds an artistic effect to an image and can enhance the visualization of a scene. However, specialized computer vision tasks, such as object recognition or scene restoration, may require segmenting the blurred and non-blurred regions of partially blurred images. This study proposes a sharpness measure comprising a Local Binary Pattern (LBP) descriptor and a Pulse Coupled Neural Network (PCNN) component, used to implement a robust approach for separating in-focus regions from out-of-focus regions in a scene. The proposed approach is robust in the sense that the parameters of the model can be tuned to accommodate different settings. The presented metric exploits the fact that, in general, local image patches in blurry regions have less prominent LBP descriptors than those in non-blurry regions. By combining this sharpness measure with the PCNN algorithm, the images are segmented into clear regions together with the edges of the segmented objects. The proposed approach has been tested on a dataset of 1000 partially defocused images and compared against eight state-of-the-art methods. Based on a set of evaluation metrics, i.e., precision, recall, and F1-measure, the results show that the proposed algorithm outperforms previous works in both accuracy and efficiency. The proposed approach is also assessed with additional evaluation metrics, i.e., Accuracy, Matthews Correlation Coefficient (MCC), Dice Similarity Coefficient (DSC), and Specificity. Moreover, we adopted a fuzzy-logic ranking scheme inspired by the Evaluation Based on Distance from Average Solution (EDAS) technique to interpret the integrity of the defocus segmentation. The experimental results illustrate that the proposed approach outperforms the referenced methods by improving segmentation quality while reducing computational complexity.
Highlights
In an optical imaging system, an out-of-focus region in a digital image is the result of defocus blur
This paper proposes a novel, efficient, and accurate approach based on a sharpness metric built on Local Binary Patterns (LBP) and a pulse-synchronous mechanism known as a Pulse Coupled Neural Network (PCNN) for resolving the defocus blur segmentation problem
Inspired by the feature-transform segmentation method for Low Depth of Field (LDOF) images, we turned our attention to PCNN- and LBP-based schemes to extract the focused region of LDOF images
Summary
An out-of-focus region in a digital image is the result of defocus blur. Accurate and efficient detection of the focused and out-of-focus regions is vital from various perspectives: a) to avoid costly post-processing tasks such as deconvolution of the defocused region [11]; b) to detect the blurred background in digital imaging, covering image refocusing and de-blurring, estimation of image depth, and analysis of image quality. The major aim of defocus blur segmentation is to separate sharp and blurred regions in order to facilitate the post-processing tasks mentioned above. To address this research problem, a novel hybrid approach based on a Pulse Coupled Neural Network (PCNN) and Local Binary Patterns (LBP) is presented. A. CONTRIBUTIONS This paper proposes a novel, efficient, and accurate approach based on a sharpness metric built on Local Binary Patterns (LBP) and a pulse-synchronous mechanism known as a Pulse Coupled Neural Network (PCNN) for resolving the defocus blur segmentation problem. The proposed approach yields significantly more accurate results in less computational time and works under numerous defocus conditions, as is evident from our experimental defocus segmentation results
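To make the core idea concrete, the sketch below computes basic 8-neighbour LBP codes and a per-block sharpness score. This is an illustrative, NumPy-only approximation, not the paper's exact metric: here the "prominence" of LBP descriptors in a block is approximated by the standard deviation of its LBP codes (flat, blurred patches yield nearly uniform codes, while sharp, textured patches yield diverse codes). The window size `win` is an assumed parameter for illustration.

```python
import numpy as np

def lbp_codes(gray):
    """Basic 8-neighbour LBP codes for the interior pixels of a 2-D array."""
    c = gray[1:-1, 1:-1]
    # 8 neighbours, clockwise starting from the top-left
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(c.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = gray[1 + dy: gray.shape[0] - 1 + dy,
                  1 + dx: gray.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(np.uint8) << bit
    return codes

def sharpness_map(gray, win=8):
    """Illustrative block-wise sharpness: std of LBP codes per win x win block.

    Blurred (flat) patches give near-constant codes (std ~ 0); sharp,
    textured patches give diverse codes (std > 0).
    """
    codes = lbp_codes(gray).astype(np.float64)
    h, w = codes.shape
    out = np.zeros((h // win, w // win))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = codes[i * win:(i + 1) * win,
                              j * win:(j + 1) * win].std()
    return out

# Toy image: flat ("blurred") left half, textured ("sharp") right half
rng = np.random.default_rng(0)
img = np.full((34, 66), 100.0)
img[:, 33:] += rng.normal(0, 30, size=(34, 33))
s = sharpness_map(img)  # high values mark the textured half
```

Thresholding such a map (or, as in the paper, feeding the sharpness measure into a PCNN) would then separate the in-focus segment from the defocused background.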