Abstract

It has become increasingly important for industries to promote digital transformation by utilizing 5G and the industrial internet of things (IIoT) to improve productivity. To protect IIoT application performance (work speed, productivity, etc.), quality of service (QoS) requirements often must be satisfied precisely. For this purpose, there is a growing need to automatically identify the root causes of radio-quality deterioration so that prompt countermeasures can be taken when QoS deteriorates. In this paper, a machine-learning method for identifying the root cause of 5G radio-quality deterioration is proposed. This Random Forest–based method identifies the root cause, such as distance attenuation, shielding, fading, or a combination of these, by analyzing the coefficients of a quadratic polynomial approximation in addition to the mean values of time-series data of radio-quality indicators. The detection accuracy of the proposed method was evaluated in a simulation using the MATLAB 5G Toolbox: accuracy was 98.30% when any one of the root causes occurred independently and 83.13% when multiple root causes occurred simultaneously. The proposed method was also compared with deep-learning methods, namely bidirectional long short-term memory (bidirectional LSTM) and one-dimensional convolutional neural networks (1D-CNN), that directly analyze the time-series data of radio quality, and it was found to be more accurate than both.
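To make the described pipeline concrete, the following is a minimal Python sketch of the feature extraction and classification the abstract outlines: a quadratic polynomial is fitted to each radio-quality time series, and its three coefficients plus the series mean are fed to a Random Forest classifier. The indicator layout, window length, and class labels below are placeholder assumptions for illustration, not the paper's actual data or configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(series: np.ndarray) -> np.ndarray:
    """Features for one radio-quality indicator window:
    quadratic polynomial coefficients (a2, a1, a0) plus the mean."""
    t = np.arange(len(series))
    a2, a1, a0 = np.polyfit(t, series, deg=2)  # quadratic approximation
    return np.array([a2, a1, a0, series.mean()])

def build_feature_matrix(windows: np.ndarray) -> np.ndarray:
    """windows: shape (n_samples, n_indicators, window_len), e.g. per-sample
    traces of indicators such as RSRP and SINR (hypothetical choice here).
    Returns a (n_samples, 4 * n_indicators) feature matrix."""
    return np.array([
        np.concatenate([extract_features(indicator) for indicator in sample])
        for sample in windows
    ])

# Placeholder data standing in for labeled deterioration episodes; labels
# would correspond to root-cause classes such as distance attenuation,
# shielding, fading, or their combinations.
rng = np.random.default_rng(0)
X_windows = rng.normal(size=(200, 2, 50))  # 200 samples, 2 indicators, 50 steps
y = rng.integers(0, 4, size=200)           # placeholder root-cause labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(build_feature_matrix(X_windows), y)
print(clf.predict(build_feature_matrix(X_windows[:5])))
```

Summarizing each window by polynomial coefficients rather than feeding raw samples to the classifier keeps the feature vector short and captures the trend shape (level, slope, curvature) that distinguishes, for example, gradual distance attenuation from abrupt shielding.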
