Haze contamination severely degrades the quality and accuracy of optical remote sensing (RS) images, including hyperspectral images (HSIs). Currently, there are no paired benchmark datasets of hazy and haze-free scenes for HSI dehazing, and few studies have analyzed the distributional properties of haze in the spatial and spectral domains. In this paper, we developed a new haze synthesis strategy and constructed the first hyperspectral dehazing benchmark dataset (HyperDehazing), which contains 2000 pairs of synthetic HSIs covering 100 scenes and an additional 70 real hazy HSIs. By analyzing the distribution characteristics of haze, we further proposed a deep learning model called HyperDehazeNet for haze removal from HSIs. It combines haze-insensitive longwave information injection, novel attention mechanisms, a spectral loss function, and residual learning to improve dehazing and scene reconstruction capability. Comprehensive experiments demonstrate that the HyperDehazing dataset faithfully represents the complex haze of real scenes, offering synthetic authenticity and scene diversity, and establishes a new benchmark for training and evaluating HSI dehazing methods. Results on the HyperDehazing dataset show that the proposed HyperDehazeNet effectively removes complex haze from HSIs, with outstanding spectral reconstruction and feature differentiation capabilities. Furthermore, additional experiments on real HSIs, as well as on the widely used Landsat-8 and Sentinel-2 datasets, showcase the exceptional dehazing performance and robust generalization of HyperDehazeNet. Our method surpasses other state-of-the-art methods while maintaining high computational efficiency and a low parameter count.
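The abstract does not describe the synthesis strategy itself, so the following is only a minimal, hedged sketch of how paired hazy/haze-free HSIs are commonly simulated: the standard atmospheric scattering (Koschmieder) model with a wavelength-dependent scattering coefficient, which also illustrates why longwave bands are comparatively haze-insensitive. The function name, the random haze-thickness map, and the parameters (`airlight`, `beta_550`, `angstrom_exponent`) are illustrative assumptions, not the paper's method.

```python
# Hedged sketch, NOT the HyperDehazing synthesis strategy: standard
# atmospheric scattering model with an Angstrom-like power law so that
# longer wavelengths are scattered less (haze-insensitive longwave bands).
import numpy as np

def synthesize_hazy_hsi(clean_hsi, wavelengths_nm, airlight=0.9,
                        beta_550=1.0, angstrom_exponent=1.3, rng=None):
    """Add synthetic haze to a haze-free HSI cube of shape (H, W, Bands).

    I(x, lam) = J(x, lam) * t(x, lam) + A * (1 - t(x, lam)),
    t(x, lam) = exp(-beta(lam) * d(x)),  beta(lam) ~ (lam / 550)^(-alpha).
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w, _ = clean_hsi.shape

    # Hypothetical haze-thickness map d(x); a real dataset would typically
    # derive spatial haze patterns from observed hazy scenes instead.
    d = np.clip(rng.random((h, w)) + 0.5 * rng.standard_normal((h, w)), 0.0, None)

    # Wavelength-dependent scattering coefficient (smaller at long wavelengths).
    beta = beta_550 * (np.asarray(wavelengths_nm) / 550.0) ** (-angstrom_exponent)

    # Per-band transmission and mixing with the airlight term.
    t = np.exp(-beta[None, None, :] * d[..., None])      # (H, W, Bands)
    hazy = clean_hsi * t + airlight * (1.0 - t)
    return np.clip(hazy, 0.0, 1.0)
```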