Abstract

Accurate wireless channel quality prediction over 4G LTE networks continues to be an important problem, as future channel predictions are widely leveraged to meet the strict requirements of applications such as 360-degree video, AR/VR, and online games. The availability of large amounts of wireless channel data, the increase in computational power, and the advancements in the field of machine learning provide us with the opportunity to design learning-based approaches to the channel quality prediction problem. In this paper, we design discriminative sequence-to-sequence probabilistic graphical models, specifically sparse Gaussian Conditional Random Field (GCRF) models, to accurately predict future channel quality variations in 4G LTE networks based on past channel quality data. In contrast to prior work that has primarily focused on designing parsimonious Markovian models or computationally intensive deep learning models, the sparse GCRF models designed here provide superior performance while being highly interpretable and computationally efficient, making them an ideal choice for practical deployment. To validate the efficacy of our sparse GCRF model, we compare its performance (i.e., root mean squared error and mean absolute error) with i) linear regression, ii) ARIMA, and iii) a state-of-the-art deep learning model on real-world 4G LTE channel quality data collected under varying levels of user mobility for two cellular operators, and observe that the GCRF model significantly outperforms all three baselines.
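For readers unfamiliar with the model class, one standard parameterization of a sparse GCRF (following Wytock and Kolter's formulation; the exact model used in this paper may differ in its details) expresses the conditional density of the future channel-quality vector \(y\) given the past-observation vector \(x\) as

\[
p(y \mid x; \Lambda, \Theta) \;=\; \frac{1}{Z(x)} \exp\!\left( -\,y^{\top} \Lambda\, y \;-\; 2\, x^{\top} \Theta\, y \right),
\]

where \(\Lambda\) encodes the conditional dependencies among the outputs, \(\Theta\) encodes the input-output interactions, and \(Z(x)\) is the partition function. Sparsity is typically induced by adding an L1 penalty \(\lambda \left( \|\Lambda\|_1 + \|\Theta\|_1 \right)\) to the negative log-likelihood during training, and the point prediction is the conditional mean \(\mathbb{E}[y \mid x] = -\Lambda^{-1} \Theta^{\top} x\).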
