Abstract

The increasing number of connected devices and the volume of data they generate are pushing the current generation of cellular networks to their limits. Cross-layer designs (CLD) are recommended for building better-performing network applications. The 5G standard for next-generation networks includes provisions for CLD within the standard itself to cater to the resource demands of such devices. 5G will coexist with its predecessors, providing heterogeneous connectivity for users. Irrespective of the generation, the radio access network (RAN) remains a bottleneck and drastically affects performance. Signal strength is a RAN metric used in several decisions and is an important parameter for CLD. In CLD simulations, signal strength is modeled using established path-loss and fading models in which distance is the only variable; similarly, noise is modeled as a Gaussian distribution. Thus, for a static user the received signal strength (RSS) appears Gaussian, but in reality the RSS is non-uniform, with sharp peaks and a long tail. Simulations with incorrect models yield solutions whose performance lags far behind actual outcomes; hence, correct modeling of RSS is a decisive factor. In this empirical study, RSS data were collected using the NeSen app on multiple co-located smartphones. The study shows that the recorded RSS follows a t-distribution with mean in the range [-71, -77] dBm, standard deviation in [3.75, 4.85], and degrees of freedom in [6.8, 10.5].
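The contrast the abstract draws between the commonly assumed Gaussian model and the fitted t-distribution can be sketched as follows. This is a minimal illustration, not the paper's method: the specific values for the mean, scale, and degrees of freedom are illustrative midpoints picked from the ranges reported above, and the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative midpoints from the reported ranges (assumptions, not fitted values)
mu = -74.0    # mean RSS in dBm
sigma = 4.3   # standard deviation in dB
df = 8.0      # degrees of freedom of the t-distribution

n = 100_000

# Gaussian RSS model commonly used in CLD simulations
gauss_rss = rng.normal(mu, sigma, n)

# Location-scale t-distribution, as reported by the study
t_rss = mu + sigma * rng.standard_t(df, n)

# The heavier tails of the t model produce more extreme deviations,
# matching the observed "sharp peaks and long tail" behavior
gauss_tail = np.mean(np.abs(gauss_rss - mu) > 3 * sigma)
t_tail = np.mean(np.abs(t_rss - mu) > 3 * sigma)
print(f"Fraction beyond 3 sigma: Gaussian={gauss_tail:.4f}, t={t_tail:.4f}")
```

Simulating RSS with the Gaussian model underestimates how often deep fades and strong peaks occur, which is why the choice of distribution matters for CLD evaluation.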
