Abstract

This paper proposes a compression algorithm for reducing the size of sensor data. Using a dictionary-based lossless compression algorithm, sensor data can be compressed efficiently and interpreted without decompression. The correlation between the redundancy of sensor data and the compression ratio is explored. Further, a parallel compression algorithm based on MapReduce [1] is proposed. In addition, the data partitioner, which plays an important role in the performance of MapReduce applications, is discussed, along with the performance evaluation criteria proposed in this paper. Experiments demonstrate that the random sampler is suitable for highly redundant sensor data and that the proposed compression algorithms can compress such highly redundant sensor data efficiently.
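To illustrate the general idea behind dictionary-based lossless compression of redundant sensor streams (this is a minimal sketch, not the paper's exact algorithm; all function names and the example data are hypothetical), the snippet below encodes a reading stream as a small dictionary plus integer codes and answers a query directly on the compressed codes, without decompressing.

```python
from collections import OrderedDict

def dict_compress(readings):
    """Encode a stream of sensor readings as (dictionary, code list).

    Highly redundant streams (few distinct values) yield a small
    dictionary and compact codes, which is where the compression
    ratio improves.
    """
    dictionary = OrderedDict()        # distinct value -> integer code
    encoded = []
    for value in readings:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        encoded.append(dictionary[value])
    return list(dictionary.keys()), encoded

def count_matches(dictionary, encoded, predicate):
    """Answer a query on the compressed form without decompressing:
    evaluate the predicate once per dictionary entry, then count codes."""
    matching = {code for code, value in enumerate(dictionary) if predicate(value)}
    return sum(1 for code in encoded if code in matching)

# Hypothetical example: a highly redundant temperature stream.
readings = [21.5, 21.5, 21.5, 22.0, 21.5, 22.0, 21.5]
dictionary, encoded = dict_compress(readings)
print(dictionary, encoded)                                    # [21.5, 22.0] [0, 0, 0, 1, 0, 1, 0]
print(count_matches(dictionary, encoded, lambda v: v > 21.8)) # 2
```

The same encode step can be applied per input split in a MapReduce job, with a sampling-based partitioner deciding how records are distributed across reducers; the sketch above only shows the single-node encoding and querying step.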
