Abstract
To support communications of a large number of deployed devices while guaranteeing limited signaling load, low energy consumption, and high reliability, future cellular systems require efficient random access protocols. However, collision resolution at the receiver remains the main bottleneck of these protocols. The network-assisted diversity multiple access (NDMA) protocol solves this issue and attains the highest potential throughput, at the cost of keeping devices active to acquire feedback and repeating transmissions until successful decoding. In contrast, another potential approach is the feedback-free NDMA (FF-NDMA) protocol, in which devices repeat packets in a pre-defined number of consecutive time slots without waiting for feedback associated with the repetitions. Here, we investigate the FF-NDMA protocol from a cellular network perspective in order to elucidate under what circumstances this scheme is more energy efficient than NDMA. We characterize the FF-NDMA protocol analytically by means of a multipacket reception model and a finite Markov chain. Analytic expressions for throughput, delay, capture probability, energy, and energy efficiency are derived, and clues for system design are established according to the different trade-offs studied. Simulation results show that FF-NDMA is more energy efficient than classical NDMA and HARQ-NDMA at low signal-to-noise ratio (SNR), and at medium SNR when the load increases.
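To illustrate the basic energy trade-off between the two schemes, the toy Python sketch below compares the energy spent per delivered packet when a device either blindly transmits a fixed number V of consecutive repetitions without feedback (FF-NDMA) or transmits and stays active to receive feedback until the packet is captured (an NDMA-like loop). All parameters (P_SLOT, V, E_TX, E_FB) are hypothetical placeholders and do not come from the paper's multipacket-reception analysis.

```python
import random

# Toy, hypothetical parameters for illustration only (not the paper's model).
P_SLOT = 0.5     # probability that a single repetition is captured at the receiver
V = 3            # FF-NDMA: fixed number of consecutive repetitions per packet
E_TX = 1.0       # energy per transmitted repetition (normalized units)
E_FB = 0.4       # extra energy spent listening for feedback after each round (NDMA)
N = 100_000      # simulated packets

def ff_ndma():
    """FF-NDMA: blindly send V copies, no feedback; packet is lost if no copy is captured."""
    energy = V * E_TX
    success = any(random.random() < P_SLOT for _ in range(V))
    return energy, success

def ndma():
    """NDMA-like: one copy per round plus feedback reception, repeated until captured."""
    energy = 0.0
    while True:
        energy += E_TX + E_FB
        if random.random() < P_SLOT:
            return energy, True

for name, scheme in (("FF-NDMA", ff_ndma), ("NDMA", ndma)):
    results = [scheme() for _ in range(N)]
    delivered = sum(ok for _, ok in results)
    total_energy = sum(e for e, _ in results)
    print(f"{name:8s} energy per delivered packet: {total_energy / delivered:.2f}")
```

Under these made-up numbers the comparison only indicates the qualitative effect studied in the paper: skipping feedback saves energy per round but wastes repetitions when the channel is good, which is why the relative efficiency depends on SNR and load.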
Highlights
The fifth generation (5G) of cellular networks, set for availability around 2020, is expected to enable a fully mobile and connected society, characterized by a massive growth in connectivity and an increased density and volume of traffic
A wide range of requirements arise, such as scalability, rapid programmability, high capacity, security, reliability, availability, low latency, and long battery life for devices [1]. All these requirements pave the way for machine-type communications (MTC), which enable the implementation of the Internet of Things (IoT) [2]
A symmetric scenario with an equal average signal-to-noise ratio (SNR), denoted γ, for all devices is used. γ is determined by the devices in the worst propagation conditions, so γ is varied through simulations to emulate different propagation conditions
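As a minimal sketch of this symmetric setup, and assuming (for illustration only) Rayleigh block fading so that the instantaneous per-slot SNR is exponentially distributed with mean equal to the average SNR γ, the per-device slot SNRs could be generated as follows; the function name and parameter values are hypothetical.

```python
import random
import math

def slot_snrs(n_devices: int, gamma_bar_db: float) -> list[float]:
    """Draw one instantaneous SNR per device, assuming Rayleigh block fading
    (exponentially distributed SNR) with a common average gamma_bar."""
    gamma_bar = 10 ** (gamma_bar_db / 10)              # dB -> linear
    return [random.expovariate(1.0 / gamma_bar) for _ in range(n_devices)]

# Sweep the average SNR to emulate devices in different propagation conditions.
for gamma_db in (0, 5, 10, 15):
    snrs = slot_snrs(n_devices=4, gamma_bar_db=gamma_db)
    print(f"avg SNR {gamma_db:2d} dB -> sample slot SNRs:",
          [f"{10 * math.log10(s):.1f} dB" for s in snrs])
```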
Summary
The fifth generation (5G) of cellular networks, set for availability around 2020, is expected to enable a fully mobile and connected society, characterized by a massive growth in connectivity and an increased density and volume of traffic. MTC systems consider a wide range of use cases. To address such a massive number of low-powered devices generating bursty traffic with low latency requirements, simple medium access control (MAC)-layer random access protocols of the ALOHA type are preferred, because they offer a relatively straightforward implementation and can accommodate bursty devices in a shared communication channel [4, 5]. They are used in today's most advanced cellular networks (e.g., the random access channel (RACH) in LTE) [6] and are being considered in different MTC systems, such as LoRa [7], SigFox, enhanced MTC [8], narrowband (NB) LTE-M [9, 10], and NB-IoT [11, 12, 13].
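For context only, the classical slotted-ALOHA throughput S(G) = G·e^(−G), with G the offered load in packets per slot, can be evaluated in a few lines of Python; this is the standard collision-channel result and not the multipacket-reception analysis carried out in the paper.

```python
import math

def slotted_aloha_throughput(G: float) -> float:
    """Classical slotted-ALOHA throughput under the collision-channel model."""
    return G * math.exp(-G)

for G in (0.25, 0.5, 1.0, 2.0):
    print(f"G = {G:4.2f} -> S = {slotted_aloha_throughput(G):.3f}")
# Throughput peaks at 1/e ~ 0.368 at G = 1, which is why plain ALOHA alone cannot
# serve very dense deployments and motivates collision-resolution schemes like NDMA.
```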