Abstract

Existing image dehazing methods have made remarkable progress. However, they generally perform poorly on images with dense haze and often produce unsatisfactory results with detail degradation or color distortion. In this paper, we propose a density‐aware diffusion model (DADM) for image dehazing. Guided by the haze density, our DADM can handle images with dense haze and complex environments. Specifically, we introduce a density‐aware dehazing network (DADNet) in the reverse diffusion process, which helps DADM gradually recover a clear haze‐free image from a hazy image. To improve the performance of the network, we design a cross‐feature density extraction module (CDEModule) to estimate the haze density of the image and a density‐guided feature fusion block (DFFBlock) to learn effective contextual features. Furthermore, we introduce an indirect sampling strategy in the test sampling process, which not only suppresses the accumulation of errors but also ensures the stability of the results. Extensive experiments on popular benchmarks validate the superior performance of the proposed method. The code is released at https://github.com/benchacha/DADM.
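To make the overall pipeline concrete, the sketch below illustrates how a density map estimated from the hazy input could condition each step of the reverse diffusion process. It is a minimal, hypothetical illustration only: the module names (DensityExtractor, DADNet), their architectures, and the DDPM-style sampler are assumptions for exposition and are not the authors' released implementation, which should be consulted at the repository above.

```python
# Hypothetical sketch of density-guided reverse diffusion for dehazing.
# All module internals below are placeholders, not the paper's actual code.
import torch
import torch.nn as nn


class DensityExtractor(nn.Module):
    """Toy stand-in for the CDEModule: predicts a per-pixel haze density map."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, hazy):
        return self.net(hazy)  # (B, 1, H, W) density in [0, 1]


class DADNet(nn.Module):
    """Toy stand-in for the density-aware dehazing network used at each step."""
    def __init__(self, channels=3):
        super().__init__()
        # Conditions the noise prediction on the hazy image and its density map.
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels + 1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x_t, hazy, density):
        return self.net(torch.cat([x_t, hazy, density], dim=1))  # predicted noise


@torch.no_grad()
def reverse_diffusion(hazy, extractor, denoiser, betas):
    """Standard DDPM-style ancestral sampling conditioned on the hazy image
    and its estimated density map (illustrative only)."""
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)
    density = extractor(hazy)              # haze density estimated once
    x_t = torch.randn_like(hazy)           # start from pure noise
    for t in reversed(range(len(betas))):
        eps = denoiser(x_t, hazy, density)
        coef = betas[t] / torch.sqrt(1.0 - alpha_bars[t])
        mean = (x_t - coef * eps) / torch.sqrt(alphas[t])
        noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
        x_t = mean + torch.sqrt(betas[t]) * noise
    return x_t  # estimated haze-free image


if __name__ == "__main__":
    hazy = torch.rand(1, 3, 64, 64)
    betas = torch.linspace(1e-4, 2e-2, 50)
    out = reverse_diffusion(hazy, DensityExtractor(), DADNet(), betas)
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

In this sketch the density map acts as a fixed conditioning signal for every denoising step; the paper's indirect sampling strategy for suppressing error accumulation is not modeled here.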