Abstract

This paper proposes a method for image retargeting quality assessment. A deep convolutional network is trained on the pixel displacement patterns produced by different image retargeting methods to yield a measure for evaluating the quality of output images. The method also extracts three further measures that assess the geometric changes of important objects in the image, the bending of block lines, and the extent of information loss during the retargeting process. Tests on two well-known databases, RetargetMe and CUHK, demonstrate the strong performance, stability, and reliability of the proposed method compared to existing methods. The innovations of the proposed method are:

- Using a deep learning method to identify the important regions of the image that contain foreground objects and people
- Providing a quality evaluation measure, obtained by training a CNN with a regression output on the pixel displacement patterns of different image retargeting methods
- Extracting the foreground objects in the original image and the corresponding objects in the retargeted image, and determining the extent of geometric change in each object
- Estimating the extent of information loss and distortion based on the extent to which the blocks of the original image have been bent
- Estimating the quality score of the retargeted image using Gaussian process regression
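The final step above can be sketched in code. This is a minimal illustration, not the paper's implementation: the abstract does not specify the features or training data, so the four per-image measures (CNN displacement score, object geometric change, block-line bending, information loss) and the subjective target scores below are hypothetical toy data. It shows only how Gaussian process regression can map such measures to a single quality score.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy training set: rows = retargeted images, columns = the four
# quality measures described in the abstract (all hypothetical values).
X_train = rng.uniform(0.0, 1.0, size=(40, 4))
# Hypothetical subjective quality scores (e.g. mean opinion scores).
y_train = X_train @ np.array([0.4, 0.25, 0.2, 0.15]) + rng.normal(0, 0.02, 40)

# Gaussian process regression with an RBF kernel plus a noise term.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X_train, y_train)

# Predict a quality score (with predictive uncertainty) for a new image.
x_new = np.array([[0.8, 0.6, 0.7, 0.9]])
score, std = gpr.predict(x_new, return_std=True)
print(float(score[0]), float(std[0]))
```

A benefit of GPR here is that the predictive standard deviation indicates how far a new measure vector lies from the training data, i.e. how much to trust the score.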
