Most of the existing Medium Access Control (MAC) layer protocols for the Internet of Things (IoT) model the traffic generated by each IoT device as random arrivals, such as those of a Poisson process. Under this model, IoT device traffic is implicitly unpredictable, so only reactive MAC-layer protocols, in which the network responds to the current traffic, are viable. In contrast, recent work has demonstrated that the traffic generated by an individual IoT device can be predictable, thus enabling predictive network protocols at the MAC layer. In this paper, we investigate information-theoretic bounds on the predictability of the traffic of individual IoT devices. To this end, first, we compare the performance achieved by the following state-of-the-art forecasters on individual IoT device traffic: Logistic Regression, Multi-Layer Perceptron (MLP), 1-Dimensional Convolutional Neural Network (1D CNN), and Long Short-Term Memory (LSTM), as well as MLP under feature selection based on Analysis of Variance (ANOVA) and the Auto-Correlation Function (ACF). Second, we quantify the gap between the performance of these forecasters and information-theoretic bounds as follows: For IoT devices that generate a fixed number of bits at each generation instant, we measure the gap between the forecasting accuracy and the information-theoretic bound established by Fano's inequality on the probability of correct prediction. Our empirical results show that existing forecasting schemes perform close to the information-theoretic bound in this case. For IoT devices that generate a variable number of bits, we measure the gap between the Mean Square Error (MSE) and the estimation-theoretic counterpart to Fano's inequality. Our empirical results show that the performance of existing forecasting schemes is far from the information-theoretic bound in this case. This work thus motivates the machine learning community to develop forecasting schemes that approach these information-theoretic bounds. Furthermore, this work is expected to inform the development of predictive MAC-layer protocols that exploit these bounds.
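For reference, and as an assumption on our part since the abstract does not state the exact formulations used in the paper, the standard textbook forms of the two bounds it invokes are as follows. Fano's inequality, for a prediction $\hat{X}$ of a discrete quantity $X$ with alphabet $\mathcal{X}$ and error probability $P_e = \Pr\{\hat{X} \neq X\}$, states that
$$ H(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr) \;\ge\; H(X \mid \hat{X}), $$
which lower-bounds $P_e$ and hence upper-bounds the probability of correct prediction. Its estimation-theoretic counterpart, for a continuous quantity $X$ estimated from observations $Y$, is
$$ \mathbb{E}\bigl[(X - \hat{X}(Y))^2\bigr] \;\ge\; \frac{1}{2\pi e}\, e^{2 h(X \mid Y)}, $$
where $h(X \mid Y)$ is the conditional differential entropy; this lower-bounds the MSE of any estimator.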