With the increasing scale of cloud computing applications for next-generation embedded systems, a major challenge facing domain scientists is how to efficiently store and analyze the vast volume of output data. Compression can reduce the amount of data that must be transferred and stored. However, most large datasets are in floating-point format, which exhibits high entropy, so existing lossless compressors cannot deliver sufficient performance for such applications. To address this problem, we propose a total variation reduction method that improves the compression ratio of lossless compressors (yielding FPC+ and FPZIP+) by employing a median-based hyperplane to precondition the data. In particular, we first try to exploit the space-filling curve (SFC), a well-known technique for preserving data locality in multi-dimensional datasets. We show and explain why a raw SFC, such as the Hilbert or Z-order curve, cannot improve the compression ratio. We then explore the opportunity and theoretical feasibility of the proposed total variation reduction-based algorithm. The experimental results show the effectiveness of the proposed method: compression ratios improve by up to 48.2% (20.6% on average) for FPZIP and 42.4% (18.4% on average) for FPC. Moreover, profiling the time composition of the proposed method reveals that median finding accounts for a large share of the execution time. Hence, we further introduce an approximate median-finding algorithm, providing a linear-time overhead reduction scheme. The experimental results demonstrate that this algorithm reduces execution time by an average of 56.7% and 40.7% compared to FPC+ and FPZIP+, respectively.
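To make the two ideas named in the abstract concrete, the following is a minimal Python sketch of median-based preconditioning combined with sampling-based approximate median finding. It is an illustration under stated assumptions, not the FPC+/FPZIP+ implementation: it assumes a per-block scalar median rather than the paper's median-based hyperplane, uses random sampling as a stand-in for the (unspecified) linear-time approximate median algorithm, and applies a bit-level XOR against the median's IEEE-754 bit pattern so the transform stays exactly invertible. All function names and parameters are hypothetical.

```python
import numpy as np

def approx_median(values, sample_size=1024, seed=0):
    # Sampling-based median estimate: a linear-time stand-in for the
    # approximate median-finding step mentioned in the abstract (the
    # paper's actual algorithm may differ).
    if values.size <= sample_size:
        return np.float64(np.median(values))
    rng = np.random.default_rng(seed)
    sample = rng.choice(values, size=sample_size, replace=False)
    return np.float64(np.median(sample))

def precondition(block):
    # Hypothetical preconditioning pass: XOR each value's IEEE-754 bit
    # pattern against the (approximate) median's bit pattern. Values close
    # to the median gain long runs of identical leading bits, lowering the
    # entropy seen by a lossless back-end such as FPC or FPZIP, while the
    # transform remains exactly invertible, so no information is lost.
    m = approx_median(block)
    residual_bits = block.view(np.uint64) ^ m.view(np.uint64)
    return residual_bits, m  # store m alongside the compressed stream

def restore(residual_bits, m):
    # Exact inverse of precondition().
    return (residual_bits ^ m.view(np.uint64)).view(np.float64)

if __name__ == "__main__":
    data = np.random.default_rng(1).normal(100.0, 0.5, 4096)  # float64 block
    bits, m = precondition(data)
    assert np.array_equal(restore(bits, m), data)  # round-trip is lossless
```

The XOR form is chosen here because naive floating-point subtraction followed by addition is not bit-exact reversible in general; a real codec would additionally have to choose the block granularity and encode the median losslessly in the stream header.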