Abstract

Recent decades have witnessed an exponential growth in network traffic, driven by the increased popularity of real-time applications such as live video chat and gaming. The resulting growth in network infrastructure has made it difficult for service providers to abide by service level agreements, especially with regard to quality-of-service guarantees. Predicting network latencies from noisy and incomplete measurements has therefore emerged as an important problem, and a plethora of solutions have been proposed. Existing network latency prediction approaches rely either on Euclidean embedding or on matrix completion methods. This work considers the estimation and prediction of network latencies from a sequence of noisy and incomplete latency matrices collected over time. An adaptive matrix completion algorithm is proposed that can handle streaming data at low computational complexity. The performance of the proposed algorithm is characterized both in theory and on a real dataset, demonstrating its viability as a network monitoring tool.
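The paper's algorithm is not reproduced in this abstract, but the following is a minimal sketch of what streaming (adaptive) matrix completion of latency snapshots can look like under a low-rank model: each incoming, partially observed latency matrix triggers regularized stochastic-gradient updates of a rank-r factorization, so the estimate tracks the network over time at low per-step cost. The function name `adaptive_mc_step`, the chosen rank, step size, and regularization are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def adaptive_mc_step(U, V, M_obs, mask, lr=0.01, reg=0.1):
    """One streaming update of a rank-r factorization M ~ U @ V.T.

    U, V   : (n, r) row/column factors for the n network nodes
    M_obs  : (n, n) latest latency snapshot (valid only where mask is True)
    mask   : (n, n) boolean array marking which entries were measured
    """
    rows, cols = np.nonzero(mask)
    for i, j in zip(rows, cols):
        ui = U[i].copy()
        err = M_obs[i, j] - ui @ V[j]
        # Regularized stochastic gradient step on the observed entry (i, j)
        U[i] += lr * (err * V[j] - reg * ui)
        V[j] += lr * (err * ui - reg * V[j])
    return U, V

# Toy usage: track a (static, for simplicity) low-rank latency matrix
# from a stream of noisy snapshots with ~30% of entries observed each time.
rng = np.random.default_rng(0)
n, r = 50, 3
U_true, V_true = rng.random((n, r)), rng.random((n, r))
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((n, r))
for t in range(200):
    M_t = U_true @ V_true.T + 0.01 * rng.standard_normal((n, n))  # noisy snapshot
    mask = rng.random((n, n)) < 0.3                               # partial observations
    U, V = adaptive_mc_step(U, V, M_t, mask)

pred = U @ V.T  # completed latency matrix estimate
rel_err = np.linalg.norm(pred - U_true @ V_true.T) / np.linalg.norm(U_true @ V_true.T)
print("relative error:", rel_err)
```

The per-snapshot cost scales with the number of observed entries times the rank, which is what makes this style of update attractive for streaming network measurements.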
