Abstract

It is well known that transport-protocol performance is severely hindered by wireless channel impairments. We study the applicability of Machine Learning (ML) techniques to predict the congestion status of 5G access networks, in particular mmWave links. We use realistic traces, generated with the 3GPP channel models, that are not biased by legacy congestion-control solutions. We start by identifying the metrics that might be exploited from the transport layer to learn the congestion state: delay and inter-arrival time. We formally study their correlation with the perceived congestion, which we ascertain based on buffer-length variation. Then, we conduct an extensive analysis of various unsupervised and supervised solutions, the latter used as a benchmark. The results show that unsupervised ML solutions can detect a large percentage of congestion situations, and they could thus open interesting possibilities when designing congestion-control solutions for next-generation transport protocols.
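As a rough illustration of the kind of pipeline the abstract describes, the sketch below extracts the two transport-layer metrics mentioned above (delay and inter-arrival time) from a packet trace and applies k-means clustering, one possible unsupervised choice, to flag a congested regime. The function names, the synthetic data, and the use of scikit-learn are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: unsupervised congestion detection from transport-layer
# metrics. Assumes per-packet arrival timestamps and one-way delays are
# available from a trace; this is NOT the paper's exact pipeline.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def congestion_features(arrival_ts, one_way_delay):
    """Per-packet features: one-way delay and inter-arrival time."""
    iat = np.diff(arrival_ts, prepend=arrival_ts[0])
    return np.column_stack([one_way_delay, iat])

def detect_congestion_unsupervised(features, n_clusters=2):
    """Cluster packets into regimes; the cluster with the larger mean
    delay is taken to be the congested one."""
    X = StandardScaler().fit_transform(features)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(X)
    congested = np.argmax(
        [features[labels == k, 0].mean() for k in range(n_clusters)])
    return labels == congested

# Example on synthetic data (illustrative only): delay roughly quadruples
# in the second half of the trace, mimicking a congested period.
rng = np.random.default_rng(0)
ts = np.cumsum(rng.exponential(1e-3, 1000))
delay = np.where(np.arange(1000) > 500,
                 rng.normal(40e-3, 5e-3, 1000),   # "congested" regime
                 rng.normal(10e-3, 1e-3, 1000))   # "uncongested" regime
flags = detect_congestion_unsupervised(congestion_features(ts, delay))
print(f"flagged {flags.mean():.0%} of packets as congested")
```

On real traces, the ground-truth label would come from the buffer-length variation mentioned in the abstract, against which the flagged packets can then be checked.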

Highlights

  • Millimeter wave is believed to be one of the key radio technologies to cope with the capacity requirements of 5G communications [1,2]

  • In this work we focus on the analysis of different machine learning algorithms over a single mmWave channel, leaving multi-channel scenarios as an extension to be tackled in future work

  • In the following we describe the main characteristics of the analyzed scenarios, and we discuss the obtained parameters, which will be used to derive congestion metrics


Introduction

Millimeter wave (mmWave) is believed to be one of the key radio technologies to cope with the capacity requirements of 5G communications [1,2]. The mmWave channel exhibits high oxygen absorption, diffraction, and penetration losses, which lead to a highly varying physical capacity. This variability might be perceived as congestion, triggering mechanisms that reduce the traffic rate. As a consequence, transport-layer protocols, such as the Transmission Control Protocol (TCP), cannot fully harness the communication capacity, and their performance is further hindered when buffers are small [4]. These algorithms do not appropriately handle the varying throughput of wireless links, which causes an under-utilization of the corresponding resources (the wireless channel).
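As a toy illustration of this effect (our own sketch under stated assumptions, not taken from the paper), the following simulates a single bottleneck buffer whose service rate halves temporarily, as a mmWave blockage might cause. The offered load never changes, yet queuing delay spikes, which a delay- or loss-based congestion controller would misread as congestion. All rates and durations are made up for illustration.

```python
# Illustrative simulation: constant offered load through a bottleneck whose
# capacity drops mid-run. Delay spikes without any increase in traffic.
import numpy as np

dt = 1e-3                          # 1 ms simulation step
send_rate = 80e6                   # constant offered load: 80 Mbit/s
capacity = np.full(2000, 100e6)    # nominal link capacity: 100 Mbit/s
capacity[800:1200] = 50e6          # capacity halves during a "blockage"

backlog = 0.0                      # bits queued at the bottleneck
delays = []
for c in capacity:
    backlog += send_rate * dt                 # arrivals in this step
    backlog = max(0.0, backlog - c * dt)      # departures in this step
    delays.append(backlog / c)                # queuing delay at rate c

delays = np.array(delays)
print(f"median delay outside blockage: {np.median(delays[:800])*1e3:.2f} ms")
print(f"peak delay during blockage:    {delays.max()*1e3:.2f} ms")
```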

