Abstract

Tone-mapping technology aims to transform High Dynamic Range (HDR) images into standard dynamic range images for display on standard screens. However, the tone-mapping process inevitably introduces various visual artifacts. To quantify the visual quality degradation of Tone-Mapped Images (TMIs) accurately and automatically, a robust blind TMI quality measurement method is developed by analyzing low-level and high-level perceptual characteristics. Specifically, for low-level visual features, considering that tone-mapping operators tend to destroy local image contrast, which conveys vital structural information, we resort to joint statistical features based on the gradient map and the Laplacian of Gaussian response to portray local contrast variation in TMIs. In addition, the Chromatic Local Binary Pattern (CLBP) is leveraged to measure colorfulness degradation over four chromatic descriptor maps. For high-level quality-aware features, deep features extracted by a pre-trained convolutional neural network are utilized to characterize semantic variation. Finally, the extracted low-level and high-level features are combined and mapped into an overall quality score by a regression function. Extensive experiments show that our metric achieves superior performance on two publicly available TMI benchmark databases.
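
The low-level contrast features rest on joint statistics of the gradient magnitude and the Laplacian of Gaussian (LoG) response. The sketch below illustrates this idea in a minimal form; the filter settings, binning scheme, and the marginal statistics returned are illustrative assumptions, not the exact formulation of the paper.

```python
import numpy as np
from scipy import ndimage


def contrast_feature_sketch(gray, bins=10, sigma=1.0):
    """Illustrative joint statistics of gradient magnitude and LoG response.

    `gray` is a 2-D float array in [0, 1]. The normalization, bin count,
    and choice of marginal distributions are simplified placeholders.
    """
    # Gradient magnitude from horizontal/vertical derivative filters.
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    grad_mag = np.hypot(gx, gy)

    # Laplacian of Gaussian response at an assumed scale.
    log_resp = ndimage.gaussian_laplace(gray, sigma=sigma)

    # Normalize both maps to [0, 1] and build a joint 2-D histogram.
    gm = grad_mag / (grad_mag.max() + 1e-12)
    lg = np.abs(log_resp) / (np.abs(log_resp).max() + 1e-12)
    joint, _, _ = np.histogram2d(gm.ravel(), lg.ravel(),
                                 bins=bins, range=[[0, 1], [0, 1]])
    joint /= joint.sum() + 1e-12

    # Marginal distributions as a compact quality-aware descriptor.
    return np.concatenate([joint.sum(axis=1), joint.sum(axis=0)])
```

In the full pipeline, features of this kind would be concatenated with the CLBP colorfulness features and the deep semantic features, then fed to a regression function (e.g., support vector regression) to predict the overall quality score.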
