This paper analyzes a dataset containing radio frequency (RF) measurements and Key Performance Indicators (KPIs) captured at 1876.6 MHz with a bandwidth of 10 MHz from an operational 4G LTE network in Nigeria. The dataset includes metrics such as RSRP (Reference Signal Received Power), which measures the power level of the reference signals; RSRQ (Reference Signal Received Quality), an indicator of signal quality that provides insight into the number of users sharing the same resources; RSSI (Received Signal Strength Indicator), which gauges the total received power within the measurement bandwidth; SINR (Signal-to-Interference-plus-Noise Ratio), a measure of signal quality that accounts for both interference and noise; and other KPIs, all collected from three base stations (evolved Node Bs, eNodeBs). After meticulous data cleaning, a subset of measurements from one serving eNodeB, spanning 20 minutes, was selected for deeper analysis. The PDCP (Packet Data Convergence Protocol) downlink throughput, a vital KPI, plays a paramount role in evaluating network quality and resource allocation strategies. Leveraging the high granularity of the data, the primary aim was to predict this throughput. For this purpose, I compared the predictive capabilities of two machine learning models: Linear Regression and Random Forest. Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) were used to evaluate the models, as together they provide a comprehensive view of prediction accuracy. The comparative analysis highlighted the superior performance of the Random Forest model in predicting PDCP DL throughput. The insights derived from this research can potentially guide network engineers and data scientists in optimizing network performance, ensuring a seamless user experience. Furthermore, as the telecommunication industry advances towards the integration of 5G and beyond, the methodologies explored in this paper will be invaluable in addressing the increasingly complex challenges of future wireless networks.
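To make the evaluation workflow concrete, the sketch below shows one plausible way to compare the two models on the RF measurements using MAE and RMSE. It is a minimal illustration, not the paper's actual pipeline: the file name `serving_enodeb_20min.csv`, the column names (`RSRP`, `RSRQ`, `RSSI`, `SINR`, `PDCP_DL_Throughput`), the train/test split, and the Random Forest hyperparameters are assumptions made for the example.

```python
# Minimal sketch: comparing Linear Regression and Random Forest for
# PDCP DL throughput prediction. File and column names are hypothetical;
# the actual dataset schema may differ.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("serving_enodeb_20min.csv")          # cleaned 20-minute subset (assumed file)
features = ["RSRP", "RSRQ", "RSSI", "SINR"]            # RF measurements used as predictors (assumed columns)
target = "PDCP_DL_Throughput"                          # KPI to predict (assumed column)

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, random_state=42
)

models = {
    "Linear Regression": LinearRegression(),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    mae = mean_absolute_error(y_test, pred)
    rmse = np.sqrt(mean_squared_error(y_test, pred))   # RMSE as the square root of MSE
    print(f"{name}: MAE={mae:.3f}, RMSE={rmse:.3f}")
```

MAE reports the average absolute deviation in the throughput's units, while RMSE penalizes large errors more heavily; reporting both gives a fuller picture of each model's accuracy.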