Abstract

This study implements the adaptive Shannon-Fano data compression algorithm on characters as input data. It also investigates the data compression ratio, i.e., the ratio between the number of data bits before and after compression. The resulting program is tested with black-box testing, by measuring the effect of the number of character variants and the number of character types on the compression ratio, and by verifying correctness with the Mean Square Error (MSE) method. The characteristics of the application are described by processing data in the form of character collections that differ in character type, variant, and character count. This research presents the algorithms that support the steps of building an adaptive Shannon-Fano compression application. The character length determines the variant value, the compression ratio, and the number of input character types. Based on the test results, no error occurs in the comparison between the original text input and the decompression output. A higher appearance frequency of a character yields a greater compression ratio for the resulting file, while the analysis shows that a higher number of input character types yields a lower compression ratio; this indicates that the proposed method improves the effectiveness and efficiency of real-time data compression.
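To make the compression-ratio metric above concrete, the following is a minimal sketch of a (non-adaptive) Shannon-Fano coder in Python. The function names and the assumption of 8 bits per original character are illustrative choices, not the paper's implementation:

```python
from collections import Counter

def build_codes(symbols, prefix=""):
    """Shannon-Fano: split the frequency-sorted symbol list into two
    groups of nearly equal total frequency, assigning 0/1 per split."""
    if len(symbols) == 1:
        return {symbols[0][0]: prefix or "0"}
    total = sum(f for _, f in symbols)
    run, split, best = 0, 1, None
    for i in range(len(symbols) - 1):
        run += symbols[i][1]
        diff = abs(total - 2 * run)  # imbalance if we cut after index i
        if best is None or diff < best:
            best, split = diff, i + 1
    codes = dict(build_codes(symbols[:split], prefix + "0"))
    codes.update(build_codes(symbols[split:], prefix + "1"))
    return codes

def compression_ratio(text):
    """Ratio of bits before to bits after compression (8-bit input assumed)."""
    freqs = sorted(Counter(text).items(), key=lambda kv: -kv[1])
    codes = build_codes(freqs)
    compressed_bits = sum(len(codes[c]) for c in text)
    return codes, (8 * len(text)) / compressed_bits
```

For the string `"aaaabbc"` this yields the codes a=0, b=10, c=11 and a ratio of 56/10 = 5.6, consistent with the abstract's observation that more frequent characters raise the ratio while more character types lower it.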

Highlights

  • Most data management is now performed routinely using computers

  • The research concludes that the method, implemented in a microcontroller embedded system, generates a higher data compression ratio using the adaptive Shannon-Fano algorithm, as defined by comparing compression ratios after the variant data values and the number of input character types are determined

  • The data compression process using the adaptive Shannon-Fano algorithm is more effective than the non-adaptive Shannon-Fano algorithm, as attested by the Mean Square Error (MSE) of the data compression error
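The MSE check named in the last highlight can be sketched as follows; the fixed code table here is a hypothetical prefix code chosen for illustration, and an MSE of zero over the character values confirms that decompression reproduced the input exactly:

```python
def encode(text, codes):
    """Concatenate the codeword of each input character."""
    return "".join(codes[c] for c in text)

def decode(bits, codes):
    """Walk the bit string, emitting a symbol whenever a codeword matches
    (valid because no codeword of a prefix code prefixes another)."""
    rev = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in rev:
            out.append(rev[cur])
            cur = ""
    return "".join(out)

def mse(original, restored):
    """Mean squared difference of character codes; 0 means lossless."""
    return sum((ord(a) - ord(b)) ** 2
               for a, b in zip(original, restored)) / len(original)

codes = {"a": "0", "b": "10", "c": "11"}  # hypothetical prefix code
text = "aabacba"
assert mse(text, decode(encode(text, codes), codes)) == 0.0
```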


Summary

Introduction

Most data management is now performed routinely using computers. Along with developments in telecommunications, the amount of information collected, processed, and prepared for access via the internet is increasing significantly. Recent advances in information technology cause massive amounts of data to be generated every second, and data storage and transmission tend to increase to an extraordinary level [1]. The demand for data exchange among users, together with the growing number of users, increases the need for data exchange channels (bandwidth) and data storage media. The investment required to satisfy those needs is not meager. One of the efforts to reduce the cost of providing infrastructure so that

