Abstract

The Burrows-Wheeler Transform followed by the Move-To-Front Transform is a commonly used transformation pipeline in data compression. It may reduce the information entropy of the input sequence, making it more compressible. This paper suggests an alternative, a Prediction-based Move-To-Front Transform, which may replace the aforementioned transformations. Based on the context, consisting of a few previously seen symbols, the Prediction-based Move-To-Front Transform selects an appropriate ordered domain of symbols to achieve a better match with the symbol currently being transformed. Freeman chain codes in four and eight directions, the Three-Orthogonal chain code, and the Vertex Chain Code were used in the experiments. We confirmed that the proposed approach, when using an appropriate context length, reduces the information entropy on chain code data to a similar extent as the Burrows-Wheeler Transform followed by the Move-To-Front Transform. Both approaches led to very similar compression efficiency on 32 test shapes when an arithmetic coder was used in the final stage. The proposed approach proved more efficient on longer chain code sequences, obtained by merging all the test chain codes of the same type.
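To illustrate the idea behind a context-driven Move-To-Front step, the sketch below contrasts the classic Move-To-Front Transform with a hypothetical context-keyed variant that keeps a separate symbol ordering per context of the last k symbols. This is only an assumption of how prediction by context could be realized; the abstract does not specify the authors' exact selection rule, alphabet ordering, or update strategy.

```python
# Minimal sketch: classic MTF vs. a context-keyed MTF (hypothetical illustration,
# not the authors' exact Prediction-based Move-To-Front Transform).
from collections import defaultdict


def mtf_encode(seq, alphabet):
    """Classic Move-To-Front: emit each symbol's index, then move it to the front."""
    table = list(alphabet)
    out = []
    for s in seq:
        i = table.index(s)
        out.append(i)
        table.insert(0, table.pop(i))
    return out


def context_mtf_encode(seq, alphabet, k=2):
    """Context-keyed MTF: one symbol ordering per context of the k previously
    seen symbols, so symbols that usually follow that context map to small indices."""
    tables = defaultdict(lambda: list(alphabet))
    out = []
    context = ()
    for s in seq:
        table = tables[context]
        i = table.index(s)
        out.append(i)
        table.insert(0, table.pop(i))
        context = (context + (s,))[-k:]
    return out


if __name__ == "__main__":
    # Freeman chain code in four directions uses the alphabet {0, 1, 2, 3}.
    chain = [0, 1, 0, 1, 2, 1, 0, 1, 0, 1]
    print(mtf_encode(chain, [0, 1, 2, 3]))
    print(context_mtf_encode(chain, [0, 1, 2, 3], k=1))
```

On repetitive chain code data such a context-keyed ordering tends to produce more small indices than the plain transform, which is what lowers the information entropy before the final entropy coding stage.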
