Abstract

An IoT system often consists of many sensors that collect data on different aspects of the system; together, these sensors describe the functional status of the system to which they belong. For a complex system composed of several IoT subsystems, the correlations between subsystems are of particular interest. However, there is still no good way to compute such correlations, because (1) the multiple sensors of an IoT system describe it as a matrix, and the correlation between matrices cannot be calculated by traditional vector-based methods such as the Pearson correlation coefficient, and (2) although AI methods such as neural networks have been applied to this problem, these black-box approaches cannot explain the underlying mathematical mechanisms and consume large amounts of memory and time. This paper proposes a novel approach, the matrix-oriented correlation computing method (MOCC), to learn the correlations between IoT systems. The core problem the method addresses is calculating the correlation between two curved surfaces, each modeled as a matrix, since an IoT system typically contains many sensors that characterize different aspects of the system and continuously generate time-series data. With MOCC, the correlation or interaction between any two subsystems can be accurately measured, which means that the state of a system can be predicted from its most strongly related system. Missing-value prediction based on MOCC is also presented in this paper. We verified the efficiency and effectiveness of the proposed method on a satellite, a typical IoT system consisting of massive numbers of sensors, and the experimental results show that it outperforms existing methods.
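To make the limitation concrete, the following is a minimal sketch (not the MOCC method itself, which the abstract does not specify) of the vector-based workarounds the abstract alludes to: applying the Pearson correlation coefficient per sensor channel, or flattening each subsystem matrix into a vector. Both discard the cross-sensor structure of the matrix representation. The subsystem matrices here are synthetic illustration data.

```python
import numpy as np

# Two subsystems, each modeled as a (sensors x timesteps) matrix,
# as described in the abstract. Synthetic data for illustration only.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 100))
B = A + 0.1 * rng.normal(size=(3, 100))  # a strongly related subsystem

def pearson(x, y):
    """Pearson correlation coefficient: defined between two vectors."""
    x = x - x.mean()
    y = y - y.mean()
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

# Workaround 1: correlate matching sensor channels one by one,
# which ignores cross-sensor interactions within each subsystem.
per_channel = [pearson(A[i], B[i]) for i in range(A.shape[0])]

# Workaround 2: flatten each matrix into a vector, which discards the
# sensor/time structure that matrix-oriented correlation aims to keep.
flat = pearson(A.ravel(), B.ravel())
```

Both workarounds reduce a matrix-vs-matrix comparison to vector operations, which is exactly the gap the proposed matrix-oriented approach is meant to close.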
