Abstract

Power consumption is a key challenge for LTE-Advanced and future 5G mobile devices. Predicting control channel signaling messages during an active connection with the network is a promising technique to improve the energy performance of LTE-A mobile devices, and it will also apply to future 5G devices due to the similarities between LTE-A and 5G New Radio (NR) standards in scheduling and controlling data transmissions. To reduce the prediction's computational complexity, and thus the power consumed by the predictor itself, various dimensionality reduction algorithms are evaluated in this paper. Specific windowing and normalization pre-processing steps are proposed to support the heterogeneous binary and integer time series data of LTE control channel messages. Using a simple Feed Forward Neural Network (FFNN) predictor, four dimensionality reduction algorithms, Principal Component Analysis (PCA), Independent Component Analysis (ICA), Autoencoder (AE), and Deep AE, are compared with respect to prediction accuracy. Experiments based on live network data show that PCA achieves the best performance, successfully reducing LTE-A control channel time series data from 450 to 45 dimensions without degrading prediction accuracy compared to an FFNN predictor without dimensionality reduction.
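The core pipeline the abstract describes, windowed and normalized control channel features reduced by PCA from 450 to 45 dimensions before the FFNN, can be sketched as follows. This is a minimal illustration with synthetic stand-in data; the window size, feature count, and value ranges are assumptions, not values from the paper.

```python
import numpy as np

# Assumed shapes for illustration: each sample is a window of 30
# control-channel messages x 15 heterogeneous binary/integer features,
# flattened to 450 input dimensions (the dimensionality named in the paper).
rng = np.random.default_rng(0)
X = rng.integers(0, 8, size=(1000, 450)).astype(float)  # synthetic stand-in data

# Normalization step: zero mean, unit variance per dimension,
# guarding against constant (zero-variance) columns.
mu, sigma = X.mean(axis=0), X.std(axis=0)
sigma[sigma == 0] = 1.0
Xn = (X - mu) / sigma

# PCA via SVD: keep the 45 leading principal components (450 -> 45).
U, S, Vt = np.linalg.svd(Xn, full_matrices=False)
components = Vt[:45]       # (45, 450) projection matrix
Z = Xn @ components.T      # (1000, 45) reduced inputs fed to the FFNN

print(Z.shape)
```

The reduced matrix `Z` would then serve as the FFNN predictor's input, cutting the first-layer multiply-accumulate cost by a factor of ten relative to the raw 450-dimensional windows.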
