Abstract

The emerging applications of the Internet of Things (IoT) in various sectors generate an enormous volume of continuous time-series data. As IoT-based sensor nodes are highly energy-constrained devices, the continuous transmission of large amounts of sensor data is challenging yet unavoidable, and it consumes a great deal of energy. In this paper, we present an energy-saving scheme that predicts periodic sensor data at the server, based on prior analysis of the continuously transmitted data from the IoT nodes. Our system consists of an IoT-based sensor network and a data processing unit. In the sensor network, two types of sensor data, temperature and humidity, are collected from four different nodes and sent to the processing unit (implemented on a Raspberry Pi). In the processing unit, we apply two machine learning models, the Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM), separately to the data from each of the four nodes to predict future values. A comparative analysis of the two models across several evaluation metrics shows that LSTM outperforms ARIMA in prediction accuracy. Finally, we show that, given the prediction accuracy of both models, the energy-saving scheme is achieved by effectively reducing the continuous transmission of data.
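The abstract does not include code, but the ARIMA-versus-LSTM forecasting comparison it describes can be illustrated with a minimal Python sketch. Everything below is an assumption for illustration: the synthetic temperature series stands in for one node's readings, and the ARIMA order (2, 1, 2), window size of 24, and LSTM width of 32 units are arbitrary choices, not the paper's reported configuration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Hypothetical stand-in for one node's temperature readings; the paper's
# actual dataset (four nodes, temperature and humidity) is not reproduced here.
rng = np.random.default_rng(0)
t = np.arange(500)
series = 25 + 5 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 0.5, t.size)

train, test = series[:400], series[400:]

# --- ARIMA forecast (order chosen arbitrarily for illustration) ---
arima = ARIMA(train, order=(2, 1, 2)).fit()
arima_pred = arima.forecast(steps=test.size)

# --- LSTM forecast trained on sliding windows of the same series ---
window = 24

def make_windows(data, window):
    X = np.array([data[i:i + window] for i in range(len(data) - window)])
    y = data[window:]
    return X[..., None], y  # add a feature dimension for the LSTM

X_train, y_train = make_windows(train, window)

model = Sequential([LSTM(32, input_shape=(window, 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=10, batch_size=32, verbose=0)

# Roll the LSTM forward one step at a time over the test horizon.
history = list(train[-window:])
lstm_pred = []
for _ in range(test.size):
    x = np.array(history[-window:])[None, :, None]
    yhat = float(model.predict(x, verbose=0)[0, 0])
    lstm_pred.append(yhat)
    history.append(yhat)

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

print(f"ARIMA RMSE: {rmse(test, arima_pred):.3f}")
print(f"LSTM  RMSE: {rmse(test, np.array(lstm_pred)):.3f}")
```

In a deployment like the one the abstract describes, forecasts of this kind would let the server fill in expected readings so that nodes can skip transmissions whose values the model already predicts within tolerance.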
