Abstract

Restoring fog-degraded images is an important deweathering problem in computer vision. The problem is ill-posed and can be regularized in a Bayesian framework using a probabilistic fusion model. This paper presents a multiscale depth fusion (MDF) method for defogging a single image. A linear model describing the stochastic residual of nonlinear filtering is first proposed, and multiscale filtering results are probabilistically blended into a fused depth map based on this model. The fusion is formulated as an energy minimization problem that incorporates spatial Markov dependence. An inhomogeneous Laplacian-Markov random field, regularized with smoothing and edge-preserving constraints, is developed for the multiscale fusion. A nonconvex potential, the adaptive truncated Laplacian, is devised to account for spatially variant characteristics such as edges and depth discontinuities. Defogging is solved by an alternating optimization algorithm that searches for the depth map minimizing the nonconvex potential in the random field. The MDF method is verified on real-world fog images, including scenes with cluttered depth that are challenging to defog at finer detail. The restored fog-free images show improved contrast and vivid colors without over-saturation. Quantitative image quality assessment is used to compare several defogging methods. Experimental results demonstrate that accurate depth estimation by the proposed edge-preserving multiscale fusion recovers high-quality images with sharp details.
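The abstract does not give the explicit energy functional, so the following is only a minimal Python sketch of what a multiscale depth fusion energy with a truncated Laplacian smoothness potential could look like. The quadratic data term, the per-scale weights, and the parameters `lam`, `sigma`, and `tau` are illustrative assumptions, not the paper's formulation, and the naive weighted-average initialization stands in for the alternating optimization described above.

```python
import numpy as np

def truncated_laplacian(x, sigma=0.1, tau=1.0):
    """Truncated Laplacian potential (illustrative): grows linearly like a
    Laplacian prior for small differences, then saturates at tau so that
    large jumps (depth discontinuities) are not over-penalized."""
    return np.minimum(np.abs(x) / sigma, tau)

def fusion_energy(d, scale_maps, weights, lam=0.5, sigma=0.1, tau=1.0):
    """Energy of a fused depth map d given per-scale depth estimates.

    Data term: weighted squared residuals against each scale's estimate
    (an assumed Gaussian residual model standing in for the paper's
    stochastic residual of nonlinear filtering).
    Smoothness term: truncated Laplacian on horizontal and vertical
    neighbor differences, i.e. a 4-connected MRF prior."""
    data = sum(w * np.sum((d - ds) ** 2) for w, ds in zip(weights, scale_maps))
    dx = np.diff(d, axis=1)
    dy = np.diff(d, axis=0)
    smooth = truncated_laplacian(dx, sigma, tau).sum() + truncated_laplacian(dy, sigma, tau).sum()
    return data + lam * smooth

# Toy usage: evaluate the energy of a naively fused 64x64 depth map
# built from three simulated scale estimates.
rng = np.random.default_rng(0)
true_depth = np.tile(np.linspace(0.2, 1.0, 64), (64, 1))
scale_maps = [true_depth + 0.05 * rng.standard_normal(true_depth.shape) for _ in range(3)]
weights = [0.5, 0.3, 0.2]
fused = sum(w * ds for w, ds in zip(weights, scale_maps))  # weighted-average initialization
print(fusion_energy(fused, scale_maps, weights))
```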

