Abstract

Modern mobile networks are becoming denser, more heavily loaded, less regularly planned, and more heterogeneous. The explosive growth in smartphone usage poses numerous challenges for the design and implementation of LTE cellular networks, and the performance of LTE networks under bursty, self-similar traffic has become a major concern. Accurate modeling of the data generated by each connected wireless device is essential for properly investigating LTE network performance. This paper presents a mathematical model for LTE networks based on queuing theory that accounts for the influence of various application types. Assuming a sporadic traffic source feeding the queue of the evolved Node B (eNodeB) and exponentially distributed service times, we construct a queuing model to estimate the performance of LTE networks. We use this model to study the influence of different application categories on LTE network performance, and we validate it against NS-3 simulations across several scenarios.
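The abstract describes a queuing model of the eNodeB buffer with exponential service times. As a minimal illustrative sketch (not the paper's actual model, whose details are not given here), the following computes the standard closed-form performance measures of a single-server M/M/1 queue; the arrival and service rates used are hypothetical example values.

```python
# Illustrative sketch only: M/M/1 queue metrics for an eNodeB-style buffer.
# The paper's full model is not reproduced here; lam and mu are example values.

def mm1_metrics(lam: float, mu: float) -> dict:
    """Return standard M/M/1 performance measures, assuming lam < mu."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                  # server utilization
    L = rho / (1.0 - rho)           # mean number of packets in system
    W = 1.0 / (mu - lam)            # mean time in system (Little's law: L = lam * W)
    Wq = rho / (mu - lam)           # mean waiting time in queue
    return {"utilization": rho, "mean_in_system": L,
            "mean_time_in_system": W, "mean_wait_in_queue": Wq}

# Example: 80 packets/s arriving at a server that drains 100 packets/s
print(mm1_metrics(80.0, 100.0))
```

With these example rates the utilization is 0.8 and the mean time in system is 0.05 s, illustrating how queuing delay grows sharply as the arrival rate approaches the service rate.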
