Abstract
Aiming at the color bias and noise amplification produced by traditional low-light image enhancement algorithms, this paper proposes a deep multi-scale pyramid hybrid network (DMPH-Net) that fuses an attention mechanism with a multi-scale pyramid. The algorithm uses DecomNet to decompose a low-light image into reflectance and illumination components. A multi-scale illumination attention module fuses the illumination with the decomposed low-light reflectance to improve the realism and detail of the reflectance. A five-layer feature pyramid with kernel selection in the PRID-net module fuses contextual information across feature layers of different scales while effectively removing the noise amplified during enhancement, and an added color loss suppresses the color bias of the output image. Multi-scale cascading and a channel attention mechanism adjust the illumination and fuse the illumination ratios, effectively enhancing brightness, texture, and other feature information in the image. DMPH-Net is experimentally validated on the LOL dataset and on the no-reference LIME, MEF, and NPE datasets. On LOL, the objective evaluation metrics PSNR↑, SSIM↑, LPIPS↓, and NIQE↓ are 23.3772, 0.8442, 0.1386, and 3.5966, respectively; on the reference-free LIME, NPE, and MEF datasets, NIQE is 3.0735, 3.1711, and 2.9464, respectively. The experiments show that DMPH-Net preserves image detail and texture during enhancement and denoising, effectively enhances low-light images, and reduces noise and color bias. Compared with RUAS, UnRetinex-Net, and other enhancement algorithms, it improves on the objective evaluation metrics PSNR, SSIM, LPIPS, and NIQE.
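The abstract gives no code, but the Retinex-style pipeline it describes (decomposition into reflectance and illumination, illumination adjustment guided by a ratio and channel attention, and element-wise recombination) can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' DMPH-Net implementation: the module names, channel widths, and the simple squeeze-and-excitation-style channel attention block are assumptions introduced for clarity.

```python
# Minimal PyTorch sketch of the Retinex-style enhancement pipeline described
# in the abstract. Illustrative only: module names, channel widths, and the
# squeeze-and-excitation style channel attention are assumptions, not the
# authors' DMPH-Net code.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed form)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global spatial pooling
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))  # re-weight channels


class IlluminationAdjust(nn.Module):
    """Toy illumination adjustment: multi-scale convs fused with channel attention."""

    def __init__(self, channels: int = 16):
        super().__init__()
        self.head = nn.Conv2d(2, channels, kernel_size=3, padding=1)  # illumination + ratio map
        self.branch3 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)
        self.attn = ChannelAttention(2 * channels)
        self.tail = nn.Conv2d(2 * channels, 1, kernel_size=3, padding=1)

    def forward(self, illumination: torch.Tensor, ratio: torch.Tensor) -> torch.Tensor:
        x = torch.relu(self.head(torch.cat([illumination, ratio], dim=1)))
        x = torch.cat([self.branch3(x), self.branch5(x)], dim=1)  # multi-scale features
        x = self.attn(x)
        return torch.sigmoid(self.tail(x))  # adjusted illumination in [0, 1]


if __name__ == "__main__":
    # Placeholder decomposition outputs; in the paper these come from DecomNet
    # and from the reflectance-restoration (denoising) branch.
    reflectance = torch.rand(1, 3, 64, 64)
    illumination = torch.rand(1, 1, 64, 64)
    ratio = torch.full_like(illumination, 0.5)  # assumed target brightening ratio

    adjusted = IlluminationAdjust()(illumination, ratio)
    enhanced = reflectance * adjusted  # Retinex recombination R * L
    print(enhanced.shape)  # torch.Size([1, 3, 64, 64])
```

The final multiplication reflects the standard Retinex recombination of a restored reflectance with an adjusted illumination map; the attention and multi-scale branches stand in, loosely, for the channel attention and multi-scale cascading the abstract attributes to the illumination-adjustment stage.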