Articles published on Radio access network
4379 Search results
- New
- Research Article
- 10.1016/j.comnet.2025.111727
- Dec 1, 2025
- Computer Networks
- Andrea Lacava + 6 more
How to Poison an xApp: Dissecting Backdoor Attacks to Deep Reinforcement Learning in Open Radio Access Networks
- New
- Research Article
- 10.1016/j.engappai.2025.112274
- Dec 1, 2025
- Engineering Applications of Artificial Intelligence
- Abdelrahim Ahmad + 3 more
Anomaly detection in offshore open radio access network using long short-term memory models on a novel artificial intelligence-driven cloud-native data platform
- New
- Research Article
- 10.1016/j.comnet.2025.111745
- Dec 1, 2025
- Computer Networks
- Jann Camilo Sánchez Huertas + 3 more
Review and classification of the use of SMs for energy saving in next-generation radio access networks
- New
- Research Article
- 10.3390/s25237206
- Nov 26, 2025
- Sensors
- Partemie-Marian Mutescu + 5 more
The evolution from fifth generation (5G) to sixth generation (6G) networks demands a paradigm shift from AI-assisted functionalities to AI-native orchestration, where intelligence is intrinsic to the radio access network (RAN). This work introduces two AI-based enablers for PHY-layer awareness: (i) a waveform classifier that distinguishes orthogonal frequency-division multiplexing (OFDM) and orthogonal time frequency space (OTFS) signals directly from in-phase/quadrature (IQ) samples, and (ii) a numerology detector that estimates subcarrier spacing, fast Fourier transform (FFT) size, slot duration, and cyclic prefix type without relying on higher-layer signaling. Experimental evaluations demonstrate high accuracy, with waveform classification achieving 99.5% accuracy and numerology detection exceeding 99% for most parameters, enabling robust joint inference of waveform and numerology features. The obtained results confirm the feasibility of AI-native spectrum awareness, paving the way toward self-optimizing, context-aware, and adaptive 6G wireless systems.
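The numerology detector described above estimates FFT size and cyclic prefix parameters directly from IQ samples. As a rough, hypothetical illustration of one classical feature such a detector can exploit (the paper's actual neural models are not shown here), the sketch below scores candidate FFT sizes by cyclic-prefix autocorrelation on a synthetic OFDM signal:

```python
import numpy as np

def cp_autocorr_metric(iq, n_fft, cp_len):
    """Average correlation between each cyclic prefix and the symbol tail.

    For an OFDM signal with FFT size n_fft and cyclic prefix cp_len, the
    first cp_len samples of every symbol repeat n_fft samples later, so
    this metric peaks at the true (n_fft, cp_len) pair.
    """
    sym_len = n_fft + cp_len
    n_syms = len(iq) // sym_len
    acc = 0.0
    for s in range(n_syms):
        start = s * sym_len
        cp = iq[start:start + cp_len]
        tail = iq[start + n_fft:start + n_fft + cp_len]
        acc += np.abs(np.vdot(cp, tail)) / (
            np.linalg.norm(cp) * np.linalg.norm(tail) + 1e-12)
    return acc / max(n_syms, 1)

def make_ofdm(n_fft, cp_len, n_syms, rng):
    """Generate a random-QPSK OFDM waveform with the given numerology."""
    qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
    syms = []
    for _ in range(n_syms):
        t = np.fft.ifft(rng.choice(qpsk, size=n_fft))
        syms.append(np.concatenate([t[-cp_len:], t]))  # prepend cyclic prefix
    return np.concatenate(syms)

rng = np.random.default_rng(0)
sig = make_ofdm(n_fft=256, cp_len=32, n_syms=20, rng=rng)
scores = {n: cp_autocorr_metric(sig, n, 32) for n in (128, 256, 512)}
best = max(scores, key=scores.get)
```

On the noiseless toy signal the metric is close to 1 at the true FFT size and much lower elsewhere; a learned detector must additionally cope with noise, fading, and unknown CP lengths.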
- New
- Research Article
- 10.3390/app152312487
- Nov 25, 2025
- Applied Sciences
- Mirosław Klinkowski + 1 more
Accurate prediction of maximum flow latency is crucial for ensuring the efficient transport of latency-sensitive fronthaul traffic in packet-switched Xhaul networks while maintaining the reliable operation of 5G and beyond Radio Access Networks (RANs). Deterministic worst-case (WC) models provide strict latency guarantees but tend to overestimate actual delays, resulting in resource over-provisioning and inefficient network utilization. To address this limitation, this study evaluates a data-driven Quantile Regression (QR) model for latency prediction in Time-Sensitive Networking (TSN)-enabled packet-switched Xhaul networks operating under dynamic traffic conditions. The proposed QR model estimates high-percentile (tail) latency values by leveraging both deterministic and queuing-related data features. Its performance is quantitatively compared with the WC estimator across diverse network topologies and traffic load scenarios. The results demonstrate that the QR model achieves significantly higher prediction accuracy—particularly for midhaul flows—while still maintaining compliance with latency constraints. Furthermore, when applied to dynamic Xhaul network operation, QR-based latency predictions enable a reduction in active processing-node utilization compared with WC-based estimations. These findings confirm that data-driven models can effectively complement deterministic methods in supporting latency-aware optimization and adaptive operation of 5G/6G Xhaul networks.
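The QR model above estimates high-percentile (tail) latency. As a minimal sketch of the idea, assuming only numpy and a single made-up load feature (not the paper's deterministic and queuing-related feature set), a linear model can be trained with the pinball loss that defines quantile regression:

```python
import numpy as np

def fit_quantile_linear(X, y, tau=0.9, lr=0.05, epochs=2000):
    """Fit y ~ X @ w + b by minimizing the pinball (quantile) loss at level tau.

    The pinball loss penalizes under-prediction by tau and over-prediction
    by (1 - tau), so a high tau pushes the fit toward the upper tail.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        resid = y - (X @ w + b)
        # Subgradient of the pinball loss w.r.t. the prediction.
        g = np.where(resid > 0, -tau, 1 - tau)
        w -= lr * (X.T @ g) / n
        b -= lr * g.mean()
    return w, b

rng = np.random.default_rng(1)
load = rng.uniform(0.1, 0.9, size=2000)                        # toy feature
latency = 1.0 + 2.0 * load + rng.exponential(0.5, size=2000)   # heavy right tail
w, b = fit_quantile_linear(load[:, None], latency, tau=0.9)
pred = load * w[0] + b
coverage = (latency <= pred).mean()   # should land near tau = 0.9
```

Unlike a worst-case bound, the fitted quantile tracks the actual tail, which is the source of the resource savings the abstract reports.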
- New
- Research Article
- 10.3390/machines13121087
- Nov 25, 2025
- Machines
- Cheng Li + 4 more
Currently, fifth-generation (5G) communication has emerged as the most promising candidate for next-generation railway-dedicated communication systems. Condition monitoring of 5G networks is critical for ensuring the continuity and reliability of train–ground communications. In this paper, a real-time monitoring technology is proposed, which is based on generalized channel characteristics extracted from received Demodulation Reference Signals (DM-RSs). Furthermore, a corresponding monitoring system has been developed based on the Radio Frequency System on Chip (RFSoC). Experimental results demonstrate that the proposed condition monitoring system exhibits excellent performance: it can accurately measure key network metrics (including field strength, multipath components, and frequency offset) and enable real-time monitoring of the operational condition of 5G radio access networks (RANs) and on-board terminals. Future work will focus on integrating the monitoring system into on-board terminals.
- New
- Research Article
- 10.3390/electronics14234642
- Nov 25, 2025
- Electronics
- Faris Alsulami
Millimeter-wave radio access networks face elevated security risks at the beam level, where attackers can exploit vulnerabilities to compromise network integrity and user privacy. This paper proposes BeamSecure-AI, an artificial-intelligence-based framework that detects and mitigates beam-level attacks in mmWave RANs in real time. The proposed system combines deep reinforcement learning with explainable AI modules, enabling it to detect threats dynamically while remaining transparent about its decision-making processes. We mathematically formulate dynamic beam alignment patterns, covering multi-dimensional feature extraction across the spatial, temporal, and spectral domains. Experimental results validate the effectiveness of the proposed method across a range of attack scenarios, with a detection rate of 96.7%, a response latency of 42.5 ms, and false-positive rates below 2.3%, outperforming comparable methods. The framework can detect complex attacks such as beam stealing, jamming, and spoofing while maintaining low false-positive rates and consistent performance across urban, suburban, and rural deployment scenarios.
- New
- Research Article
- 10.1145/3768972
- Nov 24, 2025
- Proceedings of the ACM on Networking
- Haoran Wan + 1 more
Low-latency network design is essential for tomorrow's interactive applications, but it must be deployable incrementally and universally at the network's last mile. While wired broadband ISPs are rolling out the leading queue-occupancy signaling mechanisms, the cellular Radio Access Network (RAN), another important last mile for many users, lags behind these efforts. This paper proposes a new RAN design, L4Span, that abstracts the complexities of RAN queueing behind a simple interface, tying the queue state of the RAN to end-to-end low-latency signaling all the way back to the content server. At millisecond timescales, L4Span predicts the RAN's queuing occupancy and performs ECN marking for both low-latency and classic flows. L4Span is lightweight, requires minimal RAN modifications, and remains 3GPP- and O-RAN-compliant for maximum ease of deployment. We implement a prototype on the srsRAN open-source software in C++. Our evaluation compares the performance of low-latency and classic flows with and without L4Span deployed, under various wireless channel conditions. Results show that L4Span reduces the one-way delay of both low-latency and classic flows by up to 98% while simultaneously maintaining near line-rate throughput. The code is available at https://github.com/PrincetonUniversity/L4Span.
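L4Span's marking decision is driven by predicted queue occupancy. A hypothetical sketch of sojourn-time-based ECN marking in the dual-queue style (the function name, thresholds, and interface below are illustrative assumptions, not L4Span's actual API):

```python
def ecn_mark(queue_bytes, drain_rate_bps, l4s_thresh_us=500, classic_thresh_us=5000):
    """Decide ECN marking from a predicted queue sojourn time.

    Sojourn time is estimated as backlog divided by drain rate (the
    scheduler's current serving rate toward this UE). Low-latency (L4S)
    flows get an aggressive shallow threshold; classic flows a deeper
    one, mirroring dual-queue coupled AQM behaviour.
    """
    sojourn_us = queue_bytes * 8 / drain_rate_bps * 1e6
    return {
        "l4s_mark": sojourn_us > l4s_thresh_us,
        "classic_mark": sojourn_us > classic_thresh_us,
        "sojourn_us": sojourn_us,
    }

# 20 kB backlog draining at 100 Mb/s -> 1.6 ms predicted sojourn:
# shallow enough to mark the L4S queue but not the classic one.
d = ecn_mark(queue_bytes=20_000, drain_rate_bps=100e6)
```

The point of predicting occupancy at millisecond timescales, rather than reacting to it, is that RAN serving rates change faster than end-to-end congestion control can observe.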
- New
- Research Article
- 10.1145/3769001
- Nov 24, 2025
- Proceedings of the ACM on Networking
- N Cameron Matson + 2 more
Non-terrestrial networks (NTNs) have been proposed as a key component of the next generation of mobile networks. Satellite networks can potentially enable connectivity anywhere on earth, and modern satellites are capable of providing high-throughput connectivity. However, there remain several open questions about how NTNs will work in practice. We show that blindly applying terrestrial RAN architectures and connectivity management algorithms to the NTN context fails to deliver on multiple network- and user-level metrics. In this work we present an NTN system that takes a holistic view of the problem, considering both the radio access network architecture and the algorithms that drive user session orchestration in the face of satellite mobility. Using a realistic satellite emulation platform as well as large-scale simulations, we show that our proposed system outperforms baseline solutions in simultaneously balancing multiple key performance indicators, including throughput, coverage, and stability, by reducing the impact of satellite mobility.
- New
- Research Article
- 10.1145/3769000
- Nov 24, 2025
- Proceedings of the ACM on Networking
- Pranshav Gajjar + 6 more
The impending adoption of Open Radio Access Network (O-RAN) is fueling innovation in the RAN towards data-driven operation. Unlike the traditional RAN, where RAN data and its usage are restricted within proprietary and monolithic RAN equipment, the O-RAN architecture opens up access to RAN data (i.e., network telemetry) via RAN intelligent controllers (RICs) to third-party machine learning (ML)-powered applications (rApps and xApps) to optimize RAN operations. Consequently, a major focus has been placed on leveraging RAN data to unlock greater efficiency gains. However, there is an increasing recognition that RAN data access by apps could become a source of vulnerability and be exploited by malicious actors. Motivated by this, we carry out a comprehensive investigation of data vulnerabilities on both xApps and rApps, respectively hosted in the Near- and Non-real-time (RT) RIC components of O-RAN. Our investigation begins by qualitatively analyzing the O-RAN security mechanisms and limitations relevant to xApps and rApps, such as their onboarding authentication process and RIC database access mechanisms. Considering a threat model informed by this analysis, we design a viable and effective black-box evasion attack strategy targeting O-RAN RIC apps while accounting for the stringent timing constraints (particularly for xApps) and attack effectiveness. The attack strategy employs four key techniques: a model cloning algorithm, input-specific perturbations, universal adversarial perturbations (UAPs), and targeted UAPs. This strategy targets ML models used by both xApps and rApps within the O-RAN system, aiming to degrade network performance. We experimentally validate the effectiveness of the designed evasion attack strategy and quantify the scale of performance degradation using a real-world O-RAN testbed and emulation environments.
This evaluation is conducted using the Interference Classification xApp and the Power Saving rApp as representative applications for near-RT and non-RT RICs, respectively. Further, we show that the attack strategy is effective against prominent defense techniques for adversarial ML, such as defensive distillation and adversarial training.
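To illustrate the universal-perturbation technique named above (on a toy linear surrogate, not the paper's interference-classification or power-saving models), the sketch below crafts one input-agnostic perturbation that flips many samples at once:

```python
import numpy as np

def universal_perturbation(X, w, eps=2.0, iters=5):
    """Craft one input-agnostic perturbation against a linear surrogate.

    For a surrogate score f(x) = w @ x (class 1 if f > 0), stepping along
    -sign(w) lowers the score of every positive sample simultaneously, so
    the same perturbation transfers across inputs -- the essence of a UAP.
    """
    delta = np.zeros(X.shape[1])
    for _ in range(iters):
        still_correct = (X + delta) @ w > 0   # positives not yet flipped
        if not still_correct.any():
            break
        delta -= (eps / iters) * np.sign(w)
        delta = np.clip(delta, -eps, eps)     # keep the UAP norm-bounded
    return delta

rng = np.random.default_rng(2)
w = rng.normal(size=8)                               # surrogate (cloned) model
X = rng.normal(loc=0.5 * np.sign(w), size=(50, 8))   # mostly positive samples
before = ((X @ w) > 0).mean()
delta = universal_perturbation(X, w)
after = ((X + delta) @ w > 0).mean()
```

Because the perturbation is computed once offline, applying it at inference time costs nothing, which is what makes UAPs compatible with the near-RT RIC's tight timing constraints.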
- New
- Research Article
- 10.1145/3775060
- Nov 15, 2025
- ACM Transactions on Privacy and Security
- Joshua Groen + 11 more
5G and beyond cellular systems embrace the disaggregation of Radio Access Network (RAN) components, exemplified by the evolution of the fronthaul (FH) connection between cellular baseband and radio unit equipment. Crucially, synchronization over the FH is pivotal for reliable 5G services. In recent years, there has been a push to move these links to an Ethernet-based packet network topology, leveraging existing standards and ongoing research for Time-Sensitive Networking (TSN). However, TSN standards, such as the Precision Time Protocol (PTP), focus on performance with little to no concern for security. This increases the exposure of the open FH to security risks. Attacks targeting synchronization mechanisms pose significant threats, potentially disrupting 5G networks and impairing connectivity. In this paper, we demonstrate the impact of successful spoofing and replay attacks against PTP synchronization. We show how a spoofing attack is able to cause a production-ready O-RAN and 5G-compliant private cellular base station to catastrophically fail within 2 seconds of the attack, necessitating manual intervention to restore full network operations. To counter this, we design a machine learning (ML)-based monitoring solution capable of detecting various malicious attacks with over 97.5% accuracy.
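A full ML detector is beyond an abstract, but the attack signature it keys on can be sketched: a spoofed or replayed Sync stream shows up as an abrupt jump in the reported master-slave offset. A minimal rolling z-score detector on synthetic offsets (window size, threshold, and noise levels are illustrative assumptions, far simpler than the paper's model):

```python
import numpy as np

def ptp_anomaly_scores(offsets_ns, window=16):
    """Score each PTP offset sample by its deviation from recent history.

    A spoofed/replayed Sync message produces an abrupt jump in the
    reported master-slave offset; a rolling z-score flags such jumps.
    """
    scores = np.zeros(len(offsets_ns))
    for i in range(window, len(offsets_ns)):
        hist = offsets_ns[i - window:i]
        mu, sigma = hist.mean(), hist.std() + 1e-9
        scores[i] = abs(offsets_ns[i] - mu) / sigma
    return scores

rng = np.random.default_rng(3)
offsets = rng.normal(0, 50, size=200)   # nominal jitter ~50 ns
offsets[120:] += 5_000                  # injected 5 us offset step
scores = ptp_anomaly_scores(offsets)
detected_at = int(np.argmax(scores > 8))
```

A real detector must also handle slow drifts and path-delay asymmetry, which is where learned models earn their keep over a fixed threshold.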
- Research Article
- 10.1109/tcomm.2025.3594055
- Nov 1, 2025
- IEEE Transactions on Communications
- Ziyang Zhang + 6 more
Performance Analysis of uRLLC in a Scalable Cell-Free Radio Access Network System
- Research Article
- 10.69996/jcai.2025023
- Oct 31, 2025
- Journal of Computer Allied Intelligence
- Karthik Kumar Vaigandla
The exponential growth in mobile data demand, fueled by the proliferation of smart devices, IoT applications, and multimedia services, has posed significant challenges to the performance and scalability of 5G networks. Traditional network traffic management techniques are increasingly insufficient to meet the dynamic and complex requirements of modern wireless communication systems. In response, deep learning (DL) has emerged as a powerful tool to enable intelligent, adaptive, and real-time optimization of mobile network traffic in 5G environments. This review provides a comprehensive overview of the integration of deep learning algorithms into various layers of the 5G architecture, including radio access networks (RAN), core networks, and edge computing platforms. Key DL models such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), long short-term memory (LSTM), deep reinforcement learning (DRL), and graph neural networks (GNNs) are analyzed in terms of their application to traffic prediction, congestion control, resource allocation, and quality of service (QoS) enhancement. The article also explores existing challenges such as data scarcity, model interpretability, latency constraints, and deployment complexity. Furthermore, it discusses emerging trends like federated learning, transfer learning, and edge intelligence as promising directions for future research. By synthesizing state-of-the-art contributions, this review highlights the transformative potential of DL in building efficient, resilient, and autonomous 5G networks.
- Research Article
- 10.3390/electronics14214197
- Oct 27, 2025
- Electronics
- Shruti Sharma + 1 more
In this study, we developed an energy-efficient multi-user-association optimization method for a massive multi-input multi-output (M-MIMO)-enabled Cloud Radio Access Network (C-RAN) in Full-Duplex (FD) mode. Maximization of energy efficiency (EE) was achieved jointly with user association. We formulate the non-convex multi-objective optimization (MOO) problem for resource allocation and user association in C-RAN. The resulting non-convex MOO problem is non-deterministic polynomial-time (NP) hard. To tackle this complexity, we seek a trade-off between achievable rate and energy consumption. We first restate the problem as an MOO that simultaneously targets high throughput and minimal energy consumption. Using the epsilon (ε)-constraint method, we transform the MOO into an equivalent single-objective optimization (SOO) problem via a majorization–minimization (MM) approach that enables the relaxation of binary variables into continuous ones. Further, we propose a multi-objective resource allocation algorithm to obtain a Pareto optimal solution. The simulation results show a significant gain in the EE of C-RAN achieved through our proposed MOO algorithm. Our results also show remarkable trade-offs between EE and spectral efficiency (SE).
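The epsilon-constraint step can be illustrated on a toy rate/energy trade-off (the closed forms below are illustrative stand-ins, not the paper's C-RAN system model): each energy budget ε yields one single-objective problem, and sweeping ε traces the Pareto front:

```python
import numpy as np

def pareto_via_epsilon_constraint(eps_grid):
    """Trace a Pareto front by the epsilon-constraint method on a toy problem.

    Toy stand-in for a rate/energy trade-off: over power p in [0, 1],
    rate(p) = log2(1 + 10 p) and energy(p) = p. For each energy budget
    eps we maximize rate subject to energy <= eps, turning the
    two-objective problem into a family of single-objective ones.
    """
    p = np.linspace(0.0, 1.0, 1001)
    rate, energy = np.log2(1 + 10 * p), p
    front = []
    for eps in eps_grid:
        feasible = energy <= eps
        best = np.argmax(np.where(feasible, rate, -np.inf))
        front.append((energy[best], rate[best]))
    return front

front = pareto_via_epsilon_constraint(eps_grid=[0.2, 0.5, 1.0])
# Rate is increasing in p, so each optimum sits at its energy budget,
# and the three points are non-dominated: more energy buys more rate.
```

In the paper's setting the per-ε subproblem is itself non-convex, which is why the MM relaxation of the binary association variables is needed before this sweep becomes tractable.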
- Research Article
- 10.1038/s44172-025-00524-0
- Oct 21, 2025
- Communications Engineering
- Saber Hassouna + 8 more
Open Radio Access Networks (O-RAN) offer a flexible RAN architecture for future 6G systems, yet their complexity and lack of real-world testbeds pose interoperability challenges, particularly with emerging software platforms and robotic systems. Here we present a real-world software-defined radio testbed based on an open-source 4G long-term evolution (LTE) system, integrated with the near-real-time (Near-RT) RAN Intelligent Controller (RIC) via standard O-RAN E2 interfaces. It enables connectivity with robotic end devices such as a haptic controller and robotic arm, demonstrating the activation of E2 functionality within a live RAN environment. The testbed enables haptic operation with sub-one-second latency and block error rate (BLER) under 12% for tasks such as dental inspection use cases. We also demonstrate replacement of software-defined radios (SDRs) with low-power mobile dongles, achieving comparable 10 Mbps throughput while cutting power consumption by 90%. This setup establishes a foundation for advancing research and integration in managing next-generation RANs.
- Research Article
- 10.1371/journal.pone.0330226
- Oct 9, 2025
- PLOS One
- Bosen Zeng + 1 more
The open radio access network (O-RAN) architecture facilitates intelligent radio resource management via RAN intelligent controllers (RICs). Deep reinforcement learning (DRL) algorithms are integrated into RICs to address dynamic O-RAN slicing challenges. However, DRL-based O-RAN slicing suffers from instability and performance degradation when deployed on unseen tasks. We propose M2DQN, a hybrid framework that combines multi-task learning (MTL) and meta-learning to optimize DQN initialization parameters for rapid adaptation. Our method decouples the DQN into two components: shared layers trained via MTL to capture cross-task representations, and task-specific layers optimized through meta-learning for efficient fine-tuning. Experiments in an open-source network slicing environment demonstrate that M2DQN outperforms MTL, meta-learning, and policy reuse baselines, achieving improved initial performance across 91 unseen tasks. This demonstrates an effective balance between transferability and adaptability. Code is available at: https://github.com/bszeng/M2DQN.
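The shared-initialization idea can be sketched with a Reptile-style meta update on toy linear tasks (a stand-in for M2DQN's meta-optimized DQN initialization; the task family, inner loop, and hyperparameters below are invented for illustration):

```python
import numpy as np

def inner_sgd(w, X, y, lr=0.1, steps=20):
    """A few SGD steps of linear regression: the task-specific fine-tune."""
    for _ in range(steps):
        w = w - lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

def reptile_meta_init(tasks, dim, meta_lr=0.3, rounds=100):
    """Learn an initialization that adapts quickly across tasks (Reptile).

    Each round fine-tunes on one sampled task, then moves the shared
    init toward the adapted weights, so the init converges toward the
    component the tasks have in common.
    """
    rng = np.random.default_rng(0)
    w0 = np.zeros(dim)
    for _ in range(rounds):
        X, y = tasks[rng.integers(len(tasks))]
        w_adapted = inner_sgd(w0.copy(), X, y)
        w0 = w0 + meta_lr * (w_adapted - w0)
    return w0

rng = np.random.default_rng(4)
X = rng.normal(size=(50, 3))
# Tasks share a common component w_shared; each adds small task noise.
w_shared = np.array([1.0, -2.0, 0.5])
tasks = [(X, X @ (w_shared + rng.normal(0, 0.1, 3))) for _ in range(5)]
w0 = reptile_meta_init(tasks, dim=3)
err = np.linalg.norm(w0 - w_shared)   # meta-init lands near the shared part
```

This mirrors the abstract's decoupling: what the tasks share ends up in the initialization, while per-task differences are left to the cheap fine-tuning step on an unseen task.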
- Research Article
- 10.1145/3770919
- Oct 8, 2025
- Distributed Ledger Technologies: Research and Practice
- Engin Zeydan + 4 more
Self-Sovereign Identity (SSI) is a recent development in identity and access management that is based on Distributed Ledger Technology (DLT) and offers individuals and organizations control over their data. In contrast, Open Radio Access Networks (O-RAN) provide a platform for sharing radio infrastructure and data between users and mobile network operators (MNOs). Therefore, the integration of SSI with O-RAN provides an opportunity for decentralized, secure identity management that promotes a more transparent, efficient, and user-centric network environment. First, we propose the general architecture and perform a detailed threat-model and security analysis, in which we identify the potential vulnerabilities and explain how our multi-layered security strategy mitigates these risks. The authentication process is also strengthened by the integration of Quantum Key Distribution (QKD) to ensure quantum-resistant security. We have carried out simulations to evaluate the efficiency of our system for credential operations, revealing that the average time for most credential operations is within a reasonable range (between 39 and 750 ms), even when the request volume increases to 100,000. The proposed system efficiently maintains credential offering, presentation, connection creation, signing, and credential revocation times, which emphasizes its ability to effectively balance security and performance. In addition, our QKD simulations show several key management performance metrics. In the impact discussions, we address the potential integration of QKD into the authentication process to significantly increase security, with the system effectively managing key management and authentication times in a scalable manner. Finally, we discuss challenges, rationales, and technical aspects in O-RAN environments.
- Research Article
- 10.3389/frcmn.2025.1567879
- Oct 3, 2025
- Frontiers in Communications and Networks
- Urvashi Chaudhary + 2 more
This paper analyzes channel estimation for a rate splitting multiple access (RSMA) wireless network operating through a full-duplex amplify-and-forward (AF) relay. Full-duplex transmission improves temporal efficiency; however, loop interference at the strong user is an unavoidable problem in the proposed network. An orthogonal frequency division multiplexing (OFDM) system is used to provide high-data-rate communication, assuming the presence of phase noise (PN) in the local oscillators. Using the least squares (LS) estimate, the channel coefficients of the proposed RSMA relay network are estimated. In addition, convex optimization techniques are applied to estimate the phase noise components of the network. The problem is formulated by optimizing phase noise under transmit power constraints. We analyze the Bit Error Rate (BER) performance of the proposed network under binary phase shift keying (BPSK) modulation and 16-quadrature amplitude modulation (QAM). Simulation results demonstrate that channel estimation achieves better performance after PN compensation.
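The least-squares pilot estimate the abstract mentions has a one-line core: per subcarrier, the estimate is Y/X for known pilots X. A minimal numpy sketch on a synthetic per-subcarrier flat channel (QPSK pilots and the noise level are illustrative choices; the paper's relay and phase-noise terms are omitted):

```python
import numpy as np

def ls_channel_estimate(tx_pilots, rx_pilots):
    """Per-subcarrier least-squares channel estimate H_hat = Y / X.

    With known pilot symbols X and received symbols Y = H X + noise, the
    LS solution per subcarrier is simply Y / X; no channel statistics
    are required, unlike MMSE estimation.
    """
    return rx_pilots / tx_pilots

rng = np.random.default_rng(5)
n_sc = 64
h_true = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
x = rng.choice(qpsk, size=n_sc)                                  # pilots
noise = 0.01 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
y = h_true * x + noise
h_hat = ls_channel_estimate(x, y)
nmse = np.mean(np.abs(h_hat - h_true) ** 2) / np.mean(np.abs(h_true) ** 2)
```

At this SNR the normalized MSE is tiny; the paper's contribution lies in keeping it small once loop interference and oscillator phase noise corrupt the pilots.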
- Research Article
- 10.1016/j.bcra.2025.100395
- Oct 1, 2025
- Blockchain: Research and Applications
- Daniel Hindemburg De Miranda Marques + 1 more
Distributed Ledgers and Security Mechanisms on Radio Access Networks: A Systematic Review
- Research Article
- 10.1016/j.icte.2025.10.009
- Oct 1, 2025
- ICT Express
- Tacettin Ayar + 1 more
PROPER: A PROxy pair for uplink PERformance enhancement in wireless access networks