Abstract

Lossless image compression algorithms used in the prepress workflow suffer from the disadvantage that only moderate compression ratios can be achieved. Most lossy compression schemes achieve much higher compression ratios, but there is no easy way to limit the differences they introduce. Near-lossless image compression schemes are based on lossless techniques, but they offer the opportunity to put constraints on the unavoidable pixel loss. These constraints are usually expressed in terms of differences within the individual CMYK separations, an error criterion that does not match the human visual system. In this paper, we present a near-lossless image compression scheme that aims at limiting the pixel differences as observed by the human visual system. It uses the subjectively equidistant CIEL*a*b* space to express allowable color differences. Since the CMYK to CIEL*a*b* transform maps a 4D space onto a 3D space, singularities would occur, resulting in a loss of the gray component replacement information; therefore an additional dimension is added. The error quantization is based on an estimated linearization of the CIEL*a*b* transform and on the singular value decomposition of the resulting Jacobian matrix. Experimental results on some representative CMYK test images show that the visual image quality is improved and that higher compression ratios can be achieved before the visual difference is detected by a human observer.
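
The error-quantization idea summarized above can be illustrated with a small numerical sketch. The snippet below is not the authors' implementation: it assumes a hypothetical `cmyk_to_lab` callable, estimates the local Jacobian of the CMYK-to-CIEL*a*b* transform by finite differences, and uses the largest singular value of that Jacobian to derive a conservative bound on the CMYK perturbation that keeps the linearized color difference below a given ΔE tolerance. The additional dimension that the paper introduces to preserve gray component replacement information is omitted here for brevity.

```python
import numpy as np

def estimate_jacobian(cmyk_to_lab, cmyk, eps=1e-3):
    """Finite-difference estimate of the 3x4 Jacobian of a CMYK -> CIEL*a*b*
    transform at a given CMYK point (components assumed to lie in [0, 1])."""
    cmyk = np.asarray(cmyk, dtype=float)
    lab0 = np.asarray(cmyk_to_lab(cmyk), dtype=float)
    J = np.zeros((3, 4))
    for i in range(4):
        perturbed = cmyk.copy()
        perturbed[i] += eps
        J[:, i] = (np.asarray(cmyk_to_lab(perturbed), dtype=float) - lab0) / eps
    return J

def max_cmyk_perturbation(J, delta_e_max):
    """Conservative bound: any CMYK perturbation d with ||d|| <= bound maps,
    under the linearized transform, to a CIEL*a*b* displacement of at most
    delta_e_max, since ||J d|| <= sigma_max * ||d||."""
    sigma_max = np.linalg.svd(J, compute_uv=False)[0]
    return delta_e_max / sigma_max

# Purely illustrative linear stand-in for a real CMYK -> CIEL*a*b* transform.
A = np.array([[-50.0, -20.0, -10.0, -80.0],
              [ 10.0, -60.0,  20.0,   0.0],
              [ 20.0,  10.0, -70.0,   0.0]])
cmyk_to_lab_demo = lambda c: np.array([100.0, 0.0, 0.0]) + A @ c

J = estimate_jacobian(cmyk_to_lab_demo, [0.2, 0.4, 0.1, 0.3])
print(max_cmyk_perturbation(J, delta_e_max=2.0))
```

Note that in the direction associated with the smallest singular value the linearized transform is nearly singular (the gray component replacement direction), so a bound based on singular values alone would leave that direction unconstrained; this is precisely why the paper augments the target space with an additional dimension.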
