Abstract

Histogram equalization, a common image-enhancement approach, reduces the number of distinct pixel intensities, causing detail loss and an unnatural appearance. This research proposes a strategy to improve an image's contrast based on its nature. The statistical parameters mean, median, and kurtosis are extracted and used to classify images into uniform-background and non-uniform-background classes. Initially, the image is decomposed using a multilevel decomposition based on the l1−l0 minimization model to extract its significant edge information. The retrieved edge information is then employed in the appropriate histogram equalization to produce an improved result. Variational histogram equalization is proposed to overcome the over-amplification and artifacts in homogeneous zones caused by histogram spikes in uniform-background images. Non-uniform-background images are enhanced via two-dimensional histogram equalization, which exploits the joint occurrences of edge information and pixel intensities in the low-contrast image. The proposed technique is tested on five databases: CSIQ, TID2013, LOL, DRESDEN, and FLICKR. SD, CII, DE, NIQE, and AMBE are the performance metrics used to validate the algorithm's effectiveness. Experimental analysis shows that the proposed technique outperforms competing algorithms, including deep-learning architectures, achieving higher CII, SD, and DE and lower NIQE values.
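As context for the pipeline summarized above, the sketch below shows two of its ingredients in minimal form: extracting the mean, median, and kurtosis used for the uniform/non-uniform classification, and the classical global histogram equalization that the paper's variational and two-dimensional variants improve upon. The classification threshold values here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def background_features(img):
    """Compute the three statistical parameters the paper uses for
    classification: mean, median, and (excess) kurtosis of intensities."""
    x = img.astype(np.float64).ravel()
    mean = x.mean()
    median = np.median(x)
    # Fisher (excess) kurtosis: E[(x - mean)^4] / var^2 - 3.
    kurt = ((x - mean) ** 4).mean() / (x.var() ** 2) - 3.0
    return mean, median, kurt

def is_uniform_background(img, med_tol=5.0, kurt_thresh=0.0):
    """Hypothetical decision rule (thresholds are assumptions): a uniform
    background tends to give mean ≈ median and a sharply peaked
    histogram, i.e. high kurtosis."""
    mean, median, kurt = background_features(img)
    return abs(mean - median) < med_tol and kurt > kurt_thresh

def hist_equalize(img):
    """Classical global histogram equalization for an 8-bit grayscale
    image: map intensities through the normalized CDF so the lowest
    occupied bin goes to 0 and the highest to 255."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist)
    cdf_min = cdf[np.nonzero(hist)[0][0]]  # CDF at lowest occupied bin
    n = img.size
    lut = np.round((cdf - cdf_min) / (n - cdf_min) * 255.0)
    return np.clip(lut, 0, 255).astype(np.uint8)[img]
```

Stretching the CDF this way is exactly what produces the over-amplification and intensity-merging artifacts the abstract describes, since many input levels can collapse onto one output level.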
