The rapid expansion of 5G networks has revolutionized mobile communication by offering unprecedented speed, low-latency connections, and the ability to support vast numbers of connected devices. However, these advancements bring new challenges in maintaining consistent and reliable signal strength, which is critical for ensuring optimal Quality of Service (QoS). Traditional models, such as ARIMA, Random Forest (RF), and K-means clustering, struggle to capture the complex, nonlinear, and dynamic behaviour of 5G networks, leading to suboptimal prediction accuracy. In this study, we propose a novel hybrid model, Clustered Temporal Memory Networks (CTMN), which integrates DBSCAN clustering with Long Short-Term Memory (LSTM) networks to improve signal strength prediction in mobile networks. The CTMN model combines DBSCAN's ability to handle spatial variability and outliers in 5G data with LSTM's capacity for modelling long-term dependencies and nonlinear time-series patterns. Our empirical analysis demonstrates that CTMN outperforms traditional methods, achieving up to a 20.82% improvement in prediction accuracy across key performance metrics, including Mean Absolute Error (MAE), Mean Squared Error (MSE), and Root Mean Squared Error (RMSE). These findings indicate that CTMN provides a scalable, robust solution for enhancing signal strength prediction and optimizing network performance in next-generation mobile networks.
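The two-stage pipeline described above (spatial clustering, then a temporal model per cluster) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the synthetic data, parameter values, and helper names are assumptions, and a simple AR(1) least-squares fit stands in for the LSTM stage to keep the example self-contained.

```python
# Hypothetical sketch of the CTMN pipeline from the abstract:
#  1) DBSCAN groups measurement locations, separating spatial outliers;
#  2) a temporal model is trained per cluster (the paper uses an LSTM;
#     a lightweight AR(1) fit stands in here for illustration).
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# Synthetic 5G measurements: (x, y) locations plus a signal-strength series.
locations = np.vstack([
    rng.normal([0, 0], 0.1, size=(40, 2)),   # dense cell A
    rng.normal([5, 5], 0.1, size=(40, 2)),   # dense cell B
    rng.uniform(-10, 10, size=(5, 2)),       # scattered spatial outliers
])
series = rng.normal(-85, 3, size=(len(locations), 50))  # dBm readings

# Stage 1: spatial clustering; DBSCAN labels outliers as -1.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(locations)

def fit_ar1(ts):
    """Stand-in temporal model: AR(1) slope/intercept via least squares."""
    X = np.column_stack([ts[:-1], np.ones(len(ts) - 1)])
    coef, *_ = np.linalg.lstsq(X, ts[1:], rcond=None)
    return coef  # (slope, intercept)

# Stage 2: one temporal model per DBSCAN cluster (an LSTM in actual CTMN);
# outlier points (label -1) are excluded from training.
models = {}
for c in set(labels) - {-1}:
    cluster_mean_series = series[labels == c].mean(axis=0)
    models[c] = fit_ar1(cluster_mean_series)

def predict_next(cluster_id, last_value):
    """Predict the next signal-strength sample for a given cluster."""
    slope, intercept = models[cluster_id]
    return slope * last_value + intercept
```

In this sketch, DBSCAN recovers the two dense cells and flags the scattered points as noise, so each temporal model is trained only on spatially coherent data, which is the rationale the abstract gives for pairing DBSCAN with the LSTM.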