Abstract

The Multipath TCP (MPTCP) protocol, with its ability to aggregate capacity across multiple links and to maintain connectivity in the face of single-path failures, has been attracting increasing attention from both industry and academia. Multipath packet scheduling, which distributes traffic over the available subflows, is a unique and fundamental mechanism in the design and implementation of MPTCP. Existing multipath schedulers face the challenges of heterogeneous networks, diverse QoS goals, and dynamic network conditions. To address these challenges, we propose ReLeS, a Reinforcement Learning based Scheduler for MPTCP. ReLeS uses deep reinforcement learning (DRL) to train a neural network that generates the control policy for packet scheduling. It adopts a comprehensive reward function that takes diverse QoS characteristics into account to optimize scheduling decisions. To support real-time scheduling, we propose an asynchronous training algorithm that enables packet scheduling, data collection, and neural network training to run in parallel. We implement ReLeS in the Linux kernel and evaluate it under both emulated and real network conditions. Extensive experiments show that ReLeS significantly outperforms state-of-the-art schedulers.
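
To make the general idea concrete, the sketch below shows how a DRL-style scheduling policy might map per-subflow statistics to a traffic split, paired with a weighted QoS reward. This is only an illustrative sketch: the feature set, network sizes, and reward weights are hypothetical placeholders and do not reflect the actual ReLeS architecture or reward design described in the paper.

    # Illustrative sketch only; hypothetical features, sizes, and weights.
    import torch
    import torch.nn as nn

    class SchedulerPolicy(nn.Module):
        """Maps recent per-subflow observations to a traffic split over subflows."""
        def __init__(self, num_subflows=2, features_per_subflow=4, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(num_subflows * features_per_subflow, hidden),
                nn.ReLU(),
                nn.Linear(hidden, num_subflows),
            )

        def forward(self, state):
            # state: (batch, num_subflows * features_per_subflow), e.g. per-subflow
            # throughput, RTT, CWND, and loss rate sampled over the last interval.
            logits = self.net(state)
            return torch.softmax(logits, dim=-1)  # fraction of packets per subflow

    def reward(throughput, avg_delay, loss_rate, alpha=1.0, beta=0.5, gamma=0.5):
        # Hypothetical weighted QoS reward: favor aggregate throughput while
        # penalizing delay and loss; alpha/beta/gamma are placeholder weights.
        return alpha * throughput - beta * avg_delay - gamma * loss_rate

    policy = SchedulerPolicy()
    state = torch.randn(1, 8)   # one observation vector covering two subflows
    split = policy(state)       # e.g. tensor([[0.6, 0.4]])

In an asynchronous setup of the kind the abstract describes, a policy like this would be queried by the scheduler in real time, while collected transitions are used to update the network weights in a separate training process.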
