Abstract

Edge caching has become a promising technology in future wireless networks owing to its remarkable ability to reduce peak data traffic. However, storage resources are often limited in practice, so only a small number of files can be cached. Improving the cache hit ratio of finite-buffer caching based on the prediction of user demands has therefore become an important problem. In this paper, we study caching policies with a finite buffer by exploiting the prediction of a user's request time, referred to as request delay information (RDI). Based on RDI, we maximize the average cache hit ratio through a Markov decision process (MDP) approach. Specifically, we formulate an MDP problem and apply a modified value iteration algorithm to find an optimal caching policy. Moreover, we provide an upper bound and a lower bound on the cache hit ratio, as well as an analytical cache hit ratio for small buffers. To address the issue that the state space can be prohibitively large in practice, we present a low-complexity heuristic caching policy that is shown to be asymptotically optimal. Simulation results show that introducing RDI can bring significant cache hit ratio gains when the buffer size is limited.
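For readers unfamiliar with the value iteration step mentioned above, the following is a minimal, generic sketch of value iteration for a finite MDP. It is not the paper's modified algorithm (which targets the average cache hit ratio rather than a discounted objective); the state/action structure, transition tensor `P`, and reward matrix `R` are illustrative placeholders.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-6):
    """Generic discounted value iteration (illustrative only).

    P[a, s, s'] : probability of moving from state s to s' under action a
    R[a, s]     : expected immediate reward for taking action a in state s
    Returns the converged value function and a greedy policy.
    """
    n_actions, n_states = R.shape
    V = np.zeros(n_states)
    while True:
        # Bellman backup: Q[a, s] = R[a, s] + gamma * sum_{s'} P[a, s, s'] V[s']
        Q = R + gamma * np.einsum('ast,t->as', P, V)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            break
        V = V_new
    policy = Q.argmax(axis=0)  # greedy action per state
    return V, policy
```

In the caching setting described in the abstract, a state would encode the buffer contents together with the predicted request times (RDI), actions would correspond to cache replacement decisions, and the reward would reflect cache hits; the paper's modified algorithm and its low-complexity heuristic address the resulting large state space.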
