Abstract

When transmitting and storing images or other information, we want the data to occupy less space without losing any of the original information. Entropy coding is a family of lossless data compression methods that encode symbols according to their frequency of occurrence; common entropy codes include Shannon coding, Huffman coding, and arithmetic coding. This paper presents a comparative analysis of two widely used methods, Huffman coding and arithmetic coding. By exploiting the correlation between coded symbols, the probability of a symbol sequence is used in place of the smaller probability of a single symbol. Applying this principle to binary arithmetic coding yields a method more effective than traditional Huffman coding: it shortens the average code length and brings the information content of the code closer to the entropy rate of the source, significantly improving the data compression ratio of binary arithmetic coding. Experimental tests on different types of data show good compression performance, and the results demonstrate the efficiency of the optimization method described in this paper.

Keywords: Huffman coding, arithmetic coding, data compression, data decompression
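
As a minimal illustration of the first method compared above (not the paper's implementation; the function names and sample string are assumptions for demonstration), the following Python sketch builds Huffman code lengths for a small symbol distribution and compares the average code length with the source entropy, which Huffman coding can approach but not go below:

    import heapq
    from collections import Counter
    from math import log2

    def huffman_code_lengths(freqs):
        """Return {symbol: code length} for an optimal Huffman code."""
        # Each heap entry: (subtree weight, tiebreak id, {symbol: depth}).
        heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            w1, _, m1 = heapq.heappop(heap)
            w2, _, m2 = heapq.heappop(heap)
            # Merging two subtrees pushes every symbol one level deeper.
            merged = {s: d + 1 for s, d in {**m1, **m2}.items()}
            heapq.heappush(heap, (w1 + w2, next_id, merged))
            next_id += 1
        return heap[0][2]

    text = "aaaabbbccd"          # hypothetical sample data
    freqs = Counter(text)
    n = len(text)
    lengths = huffman_code_lengths(freqs)

    avg_len = sum(freqs[s] / n * lengths[s] for s in freqs)
    entropy = -sum(freqs[s] / n * log2(freqs[s] / n) for s in freqs)
    print(f"Huffman average code length: {avg_len:.3f} bits/symbol")
    print(f"source entropy:              {entropy:.3f} bits/symbol")

For the sample string this prints an average of 1.900 bits/symbol against an entropy of about 1.846 bits/symbol: Huffman coding assigns a whole number of bits per symbol, so it cannot reach the entropy exactly. A companion sketch of the interval-narrowing step of arithmetic coding (again a simplified illustration with an assumed symbol model, not the paper's optimized binary coder) shows why coding whole sequences helps: the width of the final interval equals the probability of the entire symbol sequence, so the code length in bits approaches the sequence's entropy rather than being rounded up per symbol.

    from math import log2

    def arithmetic_interval(probs, sequence):
        """Narrow [low, high) once per symbol; the final width equals
        the probability of the whole sequence, so encoding it needs
        roughly -log2(width) bits."""
        # Cumulative probability range for each symbol.
        cdf, acc = {}, 0.0
        for s, p in probs.items():
            cdf[s] = (acc, acc + p)
            acc += p
        low, high = 0.0, 1.0
        for s in sequence:
            lo, hi = cdf[s]
            width = high - low
            low, high = low + width * lo, low + width * hi
        return low, high

    probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}   # assumed model
    low, high = arithmetic_interval(probs, "aaaabbbccd")
    print(f"code length ~ {-log2(high - low):.3f} bits for 10 symbols")

Here the interval width implies about 18.46 bits for the 10-symbol sequence, i.e. roughly 1.846 bits/symbol, matching the entropy and beating the Huffman average above.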
