Abstract

To reduce unnecessary data transmissions from Internet of Things (IoT) sensors, this article proposes a multivariate-time-series-prediction-based adaptive data transmission period control (PBATPC) algorithm for IoT networks. Based on the spatio-temporal correlation between multivariate time-series data, we developed a novel multivariate time-series data encoding scheme built on the proposed time-series distance measure $\textit{ADMWD}$. The $\textit{ADMWD}$ comprises two factors significant for multivariate time-series prediction, namely the absolute deviation from the mean (ADM) and the weighted differential (WD) distance, and thus accounts for both the time distance from the prediction point and negative correlations between the time-series data. Using a convolutional neural network (CNN) model, a subset of IoT sensor readings is predicted from the encoded multivariate time-series measurements, and the predicted sensor values are compared with the actual readings to determine the adaptive data transmission period. Extensive performance evaluations show a substantial performance gain of the proposed algorithm in terms of the average power reduction ratio (approximately 12%) and the average data reconstruction error (approximately 8.32% MAPE). Finally, this article also provides a practical implementation of the proposed PBATPC algorithm over HTTP on an IEEE 802.11-based WLAN.
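
The abstract names the two components of the ADMWD measure but does not reproduce its formula. The following minimal Python sketch only illustrates the general idea under stated assumptions: an ADM term comparing each series' deviation from its own mean, a WD term on first-order differentials weighted toward the prediction point, and a hypothetical mixing weight `alpha`; none of these details are taken from the paper itself.

```python
import numpy as np

def admwd_distance(x, y, alpha=0.5):
    """Illustrative ADMWD-style distance between two equal-length time series.

    x, y  : 1-D arrays holding two sensor time series over the same window
    alpha : hypothetical weight balancing the ADM and WD terms (not from the paper)
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    T = len(x)

    # ADM term: compare deviations from each series' own mean, so series with
    # opposite trends (negative correlation) remain distinguishable.
    adm = np.abs((x - x.mean()) - (y - y.mean()))

    # WD term: difference of first-order differentials, weighted so samples
    # closer to the prediction point (end of the window) count more.
    # Linearly increasing weights are an assumption for illustration only.
    w = np.arange(1, T) / (T - 1)
    wd = w * np.abs(np.diff(x) - np.diff(y))

    return alpha * adm.mean() + (1.0 - alpha) * wd.mean()
```

Such a pairwise distance could, for example, be used to rank candidate sensors by similarity before encoding the multivariate measurements for the CNN predictor; the paper's actual encoding scheme should be consulted for the precise construction.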
