Abstract

Video content now constitutes the majority of Internet traffic, which poses a great challenge to the network infrastructure. Fortunately, the emergence of edge computing provides a promising way to reduce the video load on the network by caching content closer to users. However, under the existing edge-assisted network architecture, the cache replacement algorithm is essential to cache efficiency given the limited cache space. To investigate the challenges and opportunities involved, we first measure the performance of five state-of-the-art caching algorithms on three real-world datasets. Our observations show that state-of-the-art cache replacement algorithms suffer from the following weaknesses: 1) rule-based replacement approaches (e.g., LFU, LRU) cannot adapt to different scenarios; 2) data-driven forecasting approaches work efficiently only on specific scenarios or datasets, as the features extracted for one dataset may not work on another. Motivated by these observations and by edge-assisted computation capacity, we then propose LSTM-C, an edge-assisted intelligent cache replacement framework based on a deep Long Short-Term Memory (LSTM) network, which contains two types of modules: 1) four basic modules manage the coordination among content requests, content replacement, cache space, and service management; 2) three learning-based modules enable online deep learning to provide an intelligent caching strategy. Supported by this design, LSTM-C learns the pattern of content popularity at long and short time scales and determines the cache replacement policy accordingly. Most importantly, LSTM-C represents the request pattern with built-in memory cells and thus requires no data pre-processing, pre-programmed model, or additional information. Our experimental results show that LSTM-C outperforms state-of-the-art methods in cache hit rate on three real-world traces of video requests. When the cache size is limited, LSTM-C outperforms the baselines by 20%~32% in cache hit rate. We also show that the training and prediction times of one iteration are $8.6~ms$ and $300~\mu s$ on average, respectively, which is fast enough for online operation.
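As context for the rule-based baselines named above, the following is a minimal Python sketch of LRU and LFU replacement evaluated by cache hit rate over a request trace. The class names and the `hit_rate` helper are illustrative and not part of LSTM-C; this LFU variant keeps full frequency history rather than resetting counts on eviction.

```python
from collections import Counter, OrderedDict


class LRUCache:
    """Evict the least recently used item when the cache is full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def request(self, key):
        """Return True on a cache hit, False on a miss (the item is then cached)."""
        if key in self.store:
            self.store.move_to_end(key)  # mark as most recently used
            return True
        if len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        self.store[key] = True
        return False


class LFUCache:
    """Evict the least frequently used item when the cache is full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = set()
        self.freq = Counter()  # request counts, kept across evictions

    def request(self, key):
        self.freq[key] += 1
        if key in self.store:
            return True
        if len(self.store) >= self.capacity:
            victim = min(self.store, key=lambda k: self.freq[k])
            self.store.discard(victim)
        self.store.add(key)
        return False


def hit_rate(cache, trace):
    """Fraction of requests in the trace served from the cache."""
    hits = sum(cache.request(k) for k in trace)
    return hits / len(trace)
```

On a trace with one persistently popular item, such as `[1, 1, 2, 3, 1, 2, 3]` with capacity 2, LFU protects the popular item and achieves a higher hit rate than LRU, illustrating why neither fixed rule adapts well across different request patterns.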

Highlights

  • In the past decade, we have witnessed the explosive growth of Internet video, which accounted for 60% of Internet traffic in 2016 and is forecast to double by 2021, according to Cisco’s recent report [1]

  • Our observations show that state-of-the-art cache replacement algorithms suffer from the following weaknesses: 1) algorithms using rule-based replacement approaches (e.g., Least Frequently Used (LFU), Least Recently Used (LRU)) cannot adapt to different scenarios, while algorithms using data-driven forecasting approaches work efficiently only on specific scenarios or datasets, as the features extracted for one dataset may not work on another

  • Deploying in practice: in our current experiments, LSTM-C runs in the OpenAI Gym environment

Introduction

We are witnessing the explosive growth of Internet video, which accounted for 60% of Internet traffic in 2016 and is forecast to double by 2021, according to Cisco’s recent report [1]. The quality of individual video streams is rapidly improving as well, with a majority of them having become 1080p high-definition (HD) or even 4K ultra-HD (UHD), posing significant challenges to delivering high Quality of Service (QoS) to video consumers. Caching near the end users has been an indispensable component of Internet content distribution systems for decades. Given the sheer volume of video traffic, caching is critical to Internet video streaming, too [2]. iQiyi [3], a major video service provider in China, has nearly 6 billion hours of viewing per month [4].

(The associate editor coordinating the review of this manuscript and approving it for publication was Honghao Gao.)
