Mobile telecommunications are converging towards all-IP solutions. This is the case for Long Term Evolution (LTE) technology which, having no circuit-switched bearer to support voice traffic, needs a dedicated VoIP infrastructure, often relying on the IP Multimedia Subsystem architecture. Most telecom operators implement LTE-A, an advanced version of LTE often marketed as 4G+, which achieves peak data rates of 300 Mbps. Yet, although this technology boosts access to advanced multimedia content and services, telco operators continue to regard the VoIP market as a major revenue source for their business. In this work, the authors propose a detailed performance assessment of VoIP traffic through experimental trials carried out across a real LTE-A environment. The experimental campaign consists of two stages. First, we characterize VoIP calls between fixed and mobile terminals, based on a dataset of more than 750,000 data-voice packets. We analyze quality-of-service metrics such as round-trip time (RTT) and jitter to capture the influence of uncontrolled factors that typically arise in real-world settings. In the second stage, we further consider VoIP flows across a range of codecs, examining the trade-offs between quality and bandwidth consumption. Moreover, we propose a statistical characterization of jitter and RTT (the most critical parameters), identifying the optimal approximating distribution, namely the Generalized Extreme Value (GEV) distribution. Estimating its parameters through the Maximum Likelihood criterion reveals short-tail behaviour for jitter and long-tail behaviour for RTT, respectively.
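The GEV fitting step described above can be sketched with SciPy's maximum-likelihood estimator. This is a minimal illustration, not the paper's actual pipeline: the synthetic jitter-like samples, their size, and the distribution parameters are placeholders standing in for the measured traces.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical jitter samples in ms (placeholder for real packet-trace data);
# Gumbel noise is used here purely to have GEV-like input.
rng = np.random.default_rng(0)
samples = rng.gumbel(loc=5.0, scale=1.5, size=2000)

# Maximum-likelihood fit of the GEV distribution.
# Note: SciPy's shape parameter c equals -xi in the standard GEV notation.
c, loc, scale = genextreme.fit(samples)
xi = -c

# Sign of xi indicates tail behaviour:
#   xi > 0  -> heavy (long) tail, as reported for RTT
#   xi < 0  -> bounded (short) tail, as reported for jitter
#   xi ~ 0  -> Gumbel-type tail
tail = "long" if xi > 0 else "short/bounded"
print(f"xi={xi:.3f}, loc={loc:.2f} ms, scale={scale:.2f} ms -> {tail} tail")
```

The fitted shape parameter is what discriminates between the short- and long-tail regimes the abstract reports for jitter and RTT.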