Abstract
Cloud computing platforms enable applications to offer low-latency services to users by deploying data storage across multiple geo-distributed data centers. In this paper, through benchmark measurements on Amazon AWS and Microsoft Azure, together with an analysis of a large-scale dataset collected from a major cloud CDN provider, we identify the high tail latency problem in cloud CDNs, which can substantially undermine their efficacy. A key idea for reducing tail latency is to send requests in parallel to multiple clouds. However, since application providers often have a fixed budget for cloud services, deciding how many chunks to download from each cloud, and when to download them, in a cost-efficient manner remains an open problem in this setting. To address it, we present TailCutter, a workload scheduling framework that minimizes tail latency while meeting the cost constraints given by application providers. Specifically, we formulate the tail latency minimization (TLM) problem in cloud CDNs and design a receding horizon control based maximum tail minimization algorithm (RHC-based MTMA) to solve the TLM problem efficiently in practice. We implement TailCutter across multiple data centers of Amazon AWS and Microsoft Azure. Extensive evaluations using a large-scale real-world trace collected from a major ISP show that, under the cost constraint, TailCutter reduces the 100th-percentile user-perceived latency by up to 58.9% compared with alternative solutions.
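For intuition, the sketch below illustrates the parallel-request idea the abstract describes: fetch the same chunk from several clouds concurrently and keep whichever response arrives first, so one slow cloud cannot dominate the tail. This is not the paper's TailCutter scheduler (which additionally accounts for the cost budget and uses receding horizon control); `fetch_chunk`, `hedged_fetch`, and the `/chunks/<id>` URL scheme are hypothetical names introduced only for illustration.

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_chunk(endpoint: str, chunk_id: int) -> bytes:
    """Download one chunk from a single cloud endpoint (hypothetical URL scheme)."""
    with urllib.request.urlopen(f"{endpoint}/chunks/{chunk_id}", timeout=5) as resp:
        return resp.read()

def hedged_fetch(endpoints: list[str], chunk_id: int) -> bytes:
    """Request the same chunk from every cloud; return whichever answers first."""
    with ThreadPoolExecutor(max_workers=len(endpoints)) as pool:
        futures = [pool.submit(fetch_chunk, ep, chunk_id) for ep in endpoints]
        for fut in as_completed(futures):
            try:
                data = fut.result()
            except Exception:
                continue  # a slow or failed cloud; keep waiting on the others
            for other in futures:
                other.cancel()  # best-effort cancellation of the slower requests
            return data
    raise RuntimeError("all clouds failed to serve the chunk")
```

Note that naive hedging like this multiplies egress traffic and thus cost, which is precisely why the paper frames chunk scheduling as an optimization under a cost constraint rather than duplicating every request.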