Abstract

The Internet of Things (IoT) has emerged from the explosive growth of smart, internet-enabled devices that integrate our physical world with the virtual one. These ever-growing numbers of devices gather huge quantities of data containing very useful information. However, for a variety of reasons, the gathered data is also highly redundant. Redundancy has a positive side, as it improves the accuracy and reliability of sensed data and makes the network fault-tolerant. Yet it also causes problems for the network's longevity, data processing cost, bottlenecks, data throughput delay, and congestion and contention. This paper presents a spatial redundancy reduction approach for IoT data that mitigates these issues without compromising the accuracy and reliability of the data. The proposed approach performs spatial redundancy reduction through two algorithms. The first, Correlation Tree Construction (CTC), builds a tree based on the spatial correlation of sensor nodes. The second, Data Fusion (DF), uses the correlation tree to fuse the data of correlated sensor nodes, reducing redundancy while maintaining accuracy and reliability. The proposed approach is simulated using the Cooja simulator and Python, and the results demonstrate its efficiency in terms of spatial redundancy reduction, accuracy, and integrity.
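
To make the two-step idea concrete, the following is a minimal, illustrative sketch in Python (the language the authors use alongside Cooja). The correlation measure, the 0.9 threshold, the greedy tree-building rule, and the averaging fusion rule are all assumptions for illustration; the paper's actual CTC and DF algorithms may differ.

```python
# Illustrative sketch of the abstract's two steps: group spatially
# correlated sensor nodes into a tree (CTC), then fuse readings from
# each correlated group so only one representative value is reported
# (DF). All names, the threshold, and the averaging rule are assumed.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation of two equal-length reading series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / ((vx * vy) ** 0.5) if vx and vy else 0.0

def build_correlation_tree(readings, threshold=0.9):
    """Greedy stand-in for CTC: attach each node under the first root
    whose readings correlate above `threshold`; others become roots."""
    tree = {}  # node -> parent (None for roots)
    for node in readings:
        parent = next(
            (r for r in tree if tree[r] is None
             and pearson(readings[node], readings[r]) >= threshold),
            None,
        )
        tree[node] = parent
    return tree

def fuse(readings, tree):
    """Stand-in for DF: each root reports the mean of its own and its
    children's latest readings, removing redundant per-node reports."""
    fused = {}
    for root in (n for n, p in tree.items() if p is None):
        group = [n for n, p in tree.items() if p == root or n == root]
        fused[root] = mean(readings[n][-1] for n in group)
    return fused

# Toy example: nodes 'a' and 'b' sense the same phenomenon; 'c' does not.
readings = {
    "a": [20.1, 20.4, 20.9, 21.3],
    "b": [20.0, 20.5, 21.0, 21.2],
    "c": [5.0, 9.0, 3.0, 7.5],
}
tree = build_correlation_tree(readings)
print(tree)                  # e.g. {'a': None, 'b': 'a', 'c': None}
print(fuse(readings, tree))  # one fused value per correlated group
```

In this sketch, the two correlated nodes collapse into a single fused report, which is the redundancy reduction the abstract describes, while the uncorrelated node keeps reporting on its own so accuracy is not lost.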
