Abstract

Single-image blind deblurring for imaging sensors in the Internet of Things (IoT) is a challenging ill-conditioned inverse problem that requires regularization techniques to stabilize the image restoration process. The goal is to recover the underlying blur kernel and latent sharp image from only one blurred image. Under many degraded imaging conditions, the blur kernel can be considered not only spatially sparse but also piecewise smooth, with its support lying on a continuous curve. By taking advantage of these hybrid sparse properties of the blur kernel, a hybrid regularization method is proposed in this paper to robustly and accurately estimate the blur kernel. The effectiveness of the proposed blur kernel estimation method is enhanced by incorporating both the L1-norm of the kernel intensity and the squared L2-norm of the intensity derivative. Once an accurate estimate of the blur kernel is obtained, the original blind deblurring problem reduces to direct (non-blind) deconvolution of the blurred image. To guarantee robust non-blind deconvolution, a variational image restoration model is presented based on the L1-norm data-fidelity term and the second-order total generalized variation (TGV2) regularizer. All non-smooth optimization problems related to blur kernel estimation and non-blind deconvolution are handled effectively by numerical methods based on the alternating direction method of multipliers (ADMM). Comprehensive experiments on both synthetic and realistic datasets have been conducted to compare the proposed method with several state-of-the-art methods. The experimental comparisons illustrate the satisfactory imaging performance of the proposed method in terms of both quantitative and qualitative evaluations.
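
As a rough illustration only (not the authors' implementation), the degradation model and the hybrid kernel penalty described above can be sketched in a few lines of Python; the function names, the weights alpha and beta, and the finite-difference discretization of the derivative are assumptions introduced here.

    import numpy as np
    from scipy.signal import convolve2d

    def blur(latent, kernel, noise_sigma=0.01, rng=None):
        # Assumed forward model: B = k (*) L + n, with additive Gaussian noise.
        rng = np.random.default_rng() if rng is None else rng
        blurred = convolve2d(latent, kernel, mode="same", boundary="symm")
        return blurred + noise_sigma * rng.standard_normal(latent.shape)

    def hybrid_kernel_penalty(kernel, alpha=1.0, beta=1.0):
        # Assumed hybrid regularizer: alpha * ||k||_1 (spatial sparsity)
        # plus beta * ||grad k||_2^2 (piecewise smoothness of the kernel intensity).
        dx = np.diff(kernel, axis=1)
        dy = np.diff(kernel, axis=0)
        return alpha * np.abs(kernel).sum() + beta * ((dx**2).sum() + (dy**2).sum())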

Highlights

  • Comprehensive blind deblurring experiments on both synthetic and realistic blurred images are conducted to verify the effectiveness of the proposed method

  • The synthetic dataset has been widely used as a benchmark to evaluate the performance of blur kernel estimation

  • Non-blind deconvolution in the proposed method is achieved by combining the L1-norm data-fidelity term with the TGV2 regularizer, as summarized in Algorithm 3 (a sketch of the corresponding variational model is given below this list)
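
Algorithm 3 itself is not reproduced in this summary. As a hedged sketch only, assuming the standard second-order TGV of Bredies et al. and using a trade-off parameter lambda and weights alpha_0, alpha_1 that are our own notation, the L1-TGV2 non-blind deconvolution model referred to above plausibly takes the form

    \min_{L} \; \lambda \, \| k \otimes L - B \|_{1} + \mathrm{TGV}_{\alpha}^{2}(L),
    \quad \text{with} \quad
    \mathrm{TGV}_{\alpha}^{2}(L) = \min_{w} \; \alpha_{1} \, \| \nabla L - w \|_{1} + \alpha_{0} \, \| \mathcal{E}(w) \|_{1},

where \mathcal{E}(w) denotes the symmetrized gradient of the auxiliary vector field w and k is the blur kernel estimated in the first stage.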

Summary

Background and Related Work

Single-image blind deblurring for imaging sensors has recently received increasing attention in modern imaging applications, e.g., the Internet of Things (IoT), astronomical imaging, biomedical imaging, computational photography and microscopy [1,2,3]. The purpose of single-image blind deblurring is to recover both the blur kernel k and the latent sharp image L from only one blurred image B. It is a challenging ill-conditioned inverse problem, since many different pairs (k, L) can lead to the same B [4]. To cope with the ill-conditioned nature of blind deblurring, many statistical priors learned from blur kernels and latent sharp images have been developed to regularize the restoration process. Existing approaches broadly either estimate k and L jointly or estimate the blur kernel first and then perform non-blind deconvolution; to guarantee high-quality blind deblurring, this paper mainly focuses on the second type of method, i.e., estimating the blur kernel first and then solving the corresponding non-blind deconvolution problem. Once the blur kernel is estimated, the blind deblurring problem (2) essentially becomes a non-blind image deconvolution. Non-local total variation (NLTV)-regularized variational models can guarantee the highest-quality deconvolution because they take full advantage of the high degree of geometrical self-similarity inherent in natural images.
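
The ill-conditioned nature mentioned above is easy to reproduce numerically. The toy example below (our own illustration, not taken from the paper) builds two different pairs (k, L), namely a Gaussian-like kernel paired with a sharp image and a trivial delta kernel paired with the already-blurred image, that explain exactly the same blurred observation B:

    import numpy as np
    from scipy.signal import convolve2d

    rng = np.random.default_rng(0)
    sharp = rng.random((64, 64))                 # stand-in for a latent sharp image L

    # Pair 1: a normalized Gaussian-like kernel applied to the sharp image.
    x = np.arange(-3, 4)
    g = np.exp(-x**2 / 2.0)
    gauss = np.outer(g, g)
    gauss /= gauss.sum()
    B1 = convolve2d(sharp, gauss, mode="same")

    # Pair 2: a trivial delta kernel applied to the already-blurred image.
    delta = np.zeros((7, 7))
    delta[3, 3] = 1.0
    B2 = convolve2d(B1, delta, mode="same")

    print(np.allclose(B1, B2))  # True: (gauss, sharp) and (delta, B1) explain the same B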

Motivation and Contributions
Hybrid Regularized Blur Kernel Estimation
Sharp Edge Restoration
Blur Kernel Estimation
Robust Non-Blind Deconvolution
Update the Lagrange Multipliers
Experimental Settings
Experiments on Synthetically-Blurred Images
Methods
Experiments on a Large Blur Kernel
Experiments on Ocean Engineering
Experiments on More Realistic Blurred Images
Conclusions and Future Work