Abstract

In this paper, we model the random multiuser multi-channel access network by using the occupancy problem from probability theory, and we combine this with a network interference model in order to derive the achievable throughput capacity of such networks. Furthermore, we compare random multi-channel access with a system in which the users are able to minimize the channel occupancy. In addition, we show that the sampling rate can be reduced below the Nyquist rate if the use of the spectrum resource is known at the gateway side; this scenario is referred to as the cognitive radio context. The mathematical developments and results are illustrated through simulations. The proposed model is particularly relevant for analyzing the performance of networks in which the users are synchronized neither in time nor in frequency, as is often the case in Internet of Things (IoT) applications.
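As a rough illustration of the occupancy-problem view taken here (a sketch under assumed parameters, not the paper's derivation), the following Python snippet estimates by Monte Carlo the fraction of collision-free transmissions when Nu users each pick one of Nc frequency channels uniformly and independently at random; the value Nc = 100 and the loads in the loop are arbitrary choices for the example.

    import random
    from collections import Counter

    def collision_free_fraction(n_channels, n_users, n_trials=10000):
        """Monte Carlo estimate of the expected fraction of users whose
        randomly chosen channel is shared with no other user."""
        successes = 0
        for _ in range(n_trials):
            # Each user independently picks one channel uniformly at random.
            picks = [random.randrange(n_channels) for _ in range(n_users)]
            occupancy = Counter(picks)
            # A transmission survives only if its channel holds exactly one user.
            successes += sum(1 for channel in picks if occupancy[channel] == 1)
        return successes / (n_trials * n_users)

    # Arbitrary example: Nc = 100 channels under increasing load Nu.
    for n_users in (10, 50, 100, 200):
        frac = collision_free_fraction(100, n_users)
        print(f"Nu = {n_users:3d}, Nc = 100 -> collision-free fraction ~ {frac:.3f}")

For this simple uniform model, the estimate approaches (1 - 1/Nc)^(Nu - 1), the probability that none of the other Nu - 1 users selects the same channel; the paper's throughput capacity analysis additionally accounts for the interference model mentioned above.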

Highlights

  • The issue of transmitting asynchronous signals on a single channel has been studied for several decades [1], leading to protocols such as ALOHA (ALOHAnet), proposed by N. Abramson

  • In addition to the throughput capacity analysis, we show that the sampling rate at the gateway can be reduced in random multi-channel access, provided the use of the spectrum resource is known at the gateway

  • Let (Nc1, Nu1) be a pair of parameters leading to the binomial probability mass function P1(R), and (Nc2, Nu2) another pair such that Nc2 = 2Nc1 and Nu2 = 2Nu1, leading to the binomial probability mass function P2(R); a numerical sketch of this comparison follows this list
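As a numerical sketch of this comparison (assuming, as the occupancy model suggests, that R counts the users landing on one given channel, so that R follows a Binomial(Nu, 1/Nc) law; the values Nc1 = 50 and Nu1 = 30 are arbitrary), one can evaluate both mass functions side by side:

    from math import comb

    def binomial_pmf(r, n_users, n_channels):
        """P(R = r): exactly r of the n_users pick one given channel,
        each user choosing among n_channels uniformly at random."""
        p = 1.0 / n_channels
        return comb(n_users, r) * p**r * (1.0 - p) ** (n_users - r)

    # Arbitrary example: (Nc1, Nu1) and the doubled pair (Nc2, Nu2) = (2*Nc1, 2*Nu1).
    Nc1, Nu1 = 50, 30
    Nc2, Nu2 = 2 * Nc1, 2 * Nu1

    for r in range(5):
        print(f"r = {r}: P1 = {binomial_pmf(r, Nu1, Nc1):.4f}, "
              f"P2 = {binomial_pmf(r, Nu2, Nc2):.4f}")

Since doubling both parameters keeps the load Nu/Nc fixed, both mass functions remain close to a Poisson law of mean Nu/Nc, so the per-channel occupancy statistics are nearly unchanged.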


Summary

Introduction

The issue of transmitting asynchronous signals on a single channel has been studied for several decades [1], and it led to protocols such as ALOHA (ALOHAnet), proposed by N. Abramson. Many solutions have been proposed in the literature to overcome the distortions induced by collisions among the transmitted signals. The model has been extended to random frequency channel access in addition to random time channel access [5]. The authors proposed to model the interference induced by collisions of ultra-narrow-band signals with random frequency channel access in the context of Internet of Things (IoT) applications. In many applications, collisions occur at the gateway side (i.e. the node between the UEs and the network), since the uplink transmissions from the users (e.g. objects) are neither coordinated nor scheduled.
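For background only (a classical textbook result about the ALOHA schemes cited above, not a contribution of this paper), the single-channel random-access throughput is commonly recalled as a function of the normalized offered load G:

    % Classical ALOHA throughput, recalled for context only:
    \[
      S_{\text{pure}} = G\, e^{-2G} \quad\text{(pure ALOHA)},
      \qquad
      S_{\text{slotted}} = G\, e^{-G} \quad\text{(slotted ALOHA)},
    \]
    % where G is the normalized offered load, i.e. the average number of
    % transmission attempts per packet duration (or per slot).

The occupancy model used in this paper addresses the multi-channel counterpart of this single-channel setting.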

