Abstract

Existing image dehazing algorithms typically rely on a two-stage procedure: the medium transmittance and atmospheric light are estimated in the first stage, and the scene radiance is recovered in the second by inverting the simplified Koschmieder model. However, this unconstrained dehazing is applicable only to hazy images and introduces untoward artifacts in haze-free ones. Moreover, no algorithm that automatically detects the haze density and performs dehazing on an arbitrary image has been reported in the literature to date. This paper therefore presents an automated dehazing system capable of producing satisfactory results regardless of the presence of haze. In the proposed system, the input image simultaneously undergoes multiscale-fusion-based dehazing and haze-density estimation. A subsequent image-blending step judiciously combines the dehazed result with the original input according to the estimated haze density, and a final tone-remapping step post-processes the blended result to satisfactorily restore the scene radiance. The system's self-calibration to haze conditions lies in using the haze-density estimate to jointly guide the image-blending and tone-remapping processes. Extensive experiments demonstrate the superiority of the proposed system over state-of-the-art benchmark methods.
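The two-stage recovery and the density-guided blending described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the function names, the scalar density weight `rho`, and the linear blend are assumptions for illustration only.

```python
import numpy as np

def koschmieder_recover(I, t, A, t_min=0.1):
    """Recover scene radiance J from the simplified Koschmieder model
    I = J * t + A * (1 - t), given first-stage estimates of the
    transmittance map t (HxW) and atmospheric light A (3-vector)."""
    t = np.maximum(t, t_min)              # clamp to avoid division blow-up in dense haze
    return (I - A) / t[..., None] + A     # broadcast t over the color channels

def blend_by_haze_density(I, J, rho):
    """Illustrative blending step: combine the dehazed result J with the
    original input I using an estimated haze density rho in [0, 1].
    A haze-free input (rho ~ 0) passes through nearly unchanged."""
    return rho * J + (1.0 - rho) * I
```

With `rho = 0` the blend returns the input untouched, which captures the system's motivating property: dehazing is suppressed when no haze is detected, so haze-free images avoid the artifacts of unconstrained dehazing.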

Highlights

  • This paper presents a novel approach for dehazing a single image, regardless of haze conditions

  • The proposed automated dehazing system (AUDS) is equipped with a pseudo-cognitive function, realized by a haze density estimator (HDE), to perceive the image haze density

  • The input image and its dehazed version, obtained via multiscale-fusion-based dehazing, are combined using image blending and post-processed with tone remapping

Introduction

Outdoor imaging is subject to environmental effects, such as lighting and weather conditions. Captured images occasionally exhibit undesirable characteristics (for example, faint colors, reduced contrast, and loss of detail), posing practical difficulties for image processing algorithms deployed in high-level vision applications. In real-world scenarios, light scattering and diffusion in the turbid atmosphere are probably the most common causes of image degradation. Researchers widely refer to these degradation sources as haze, which comprises microscopic aerosols occurring naturally or originating from industrial emissions. Pei et al. [1] investigated the effects of image degradation on object recognition and found that accuracy decreased with increasing haze.
