Recent years have witnessed a phenomenal increase in video traffic. Virtual content delivery networks (vCDNs) coordinate video content delivery using computing and storage resources from the cloud and distribute content to edge nodes near consumers to reduce network traffic and improve service experience. An important objective of vCDNs is minimizing operating cost. Since cloud data centers are geo-distributed, content transfer costs vary significantly across data centers: retrieval from a distant data center is expensive, while retrieval from a nearby one is cheaper. Many popular caching algorithms in use today, such as Lru, do not consider cost when making caching decisions and, as a result, suffer from high data transfer costs and increased network congestion. On the other hand, cost-aware caching algorithms such as LandLord [1] are computationally inefficient, with time complexity scaling linearly with the amount of content in the vCDN; such algorithms cannot keep pace with the exponential growth of video content over time. In this paper, we propose Fmc (fast media caching), a cost-aware and highly efficient caching algorithm for vCDN delivery over geo-distributed data centers. The load cost of each content item is determined by both the item’s size and the distance to the data center it is loaded from. We first prove that Fmc is k/(k−h+1)-competitive under the resource augmentation paradigm, where Fmc and the optimal offline adversary have cache sizes k and h, respectively, with k ≥ h. We also show that our algorithm is simple and efficient, requiring only O(log m) time per cache access, where m is the number of data centers and is a small constant in practice. We conduct experimental studies of Fmc using both synthetic and YouTube traces. Our results show that Fmc has on average 50% and up to 66.7% lower cost than Lru. We also show that Fmc is much faster than LandLord, with a speedup that scales linearly with cache size.
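A minimal sketch of the kind of load-cost model the abstract describes, in which fetching an item costs more when the item is larger or the serving data center is farther away. The function name, the distance-based cost mapping, and the per-unit rate below are illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: a load cost that grows with item size and with the
# distance to the data center the item is loaded from. The exact mapping used
# by Fmc is defined in the paper; the constants here are assumptions.

def load_cost(item_size_mb: float, dc_distance_km: float,
              cost_per_mb_km: float = 1e-4) -> float:
    """Cost of loading an item: proportional to its size and source distance."""
    return item_size_mb * dc_distance_km * cost_per_mb_km

# Example: a 500 MB video fetched from a data center 2,000 km away costs far
# more than the same video fetched from one 50 km away.
print(load_cost(500, 2000))  # 100.0
print(load_cost(500, 50))    # 2.5
```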