Abstract

Caching popular content at radio access networks is a promising approach to improve content delivery efficiency. Most existing content delivery schemes focus on the perspective of content providers and pay less attention to the service demands of content requesters. In this paper, we investigate the content delivery policy of a mobile device with a service delay constraint in a cache-enabled heterogeneous network (HetNet), where a macro base station (MBS) is overlaid with several small base stations (SBSs) equipped with caches. In the considered network, the mobile device makes content delivery decisions based on the time, the cache state, and the signal-to-interference-plus-noise ratio (SINR) state. The problem of finding an optimal content delivery policy is formulated as a Markov decision process (MDP), where the objective is to minimize the delivery cost of the mobile device under the constraint of the content service deadline. To address this problem, we propose a reinforcement learning (RL) algorithm to learn the optimal policy. Simulation results demonstrate that the proposed RL-based policy achieves a significant reduction in content delivery cost compared with other benchmark solutions.
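As a rough illustration of how such an MDP over (time, cache state, SINR state) could be solved with RL, the sketch below uses tabular Q-learning as one concrete instance; the abstract does not specify the algorithm, action set, or cost model, so the environment interface, the two delivery actions, and all hyperparameters here are assumptions for illustration only.

```python
# Hypothetical sketch: tabular Q-learning for a content delivery MDP with states
# (remaining_time, cache_state, sinr_state). The actual paper's algorithm, action
# set, and cost model are not given in the abstract; everything below is assumed.
import random
from collections import defaultdict

# Assumed action set: fetch the content from a nearby SBS cache or from the MBS.
ACTIONS = ["sbs_cache", "mbs"]

def q_learning(env, episodes=5000, alpha=0.1, gamma=0.95, epsilon=0.1):
    """Learn a delivery policy that minimizes expected discounted delivery cost.

    `env` is a hypothetical simulator exposing reset() -> state and
    step(action) -> (next_state, cost, done), where the cost includes a
    penalty if the service deadline is violated.
    """
    q = defaultdict(float)  # Q[(state, action)] -> estimated discounted cost

    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            # Epsilon-greedy exploration over the *minimum* cost (we minimize).
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = min(ACTIONS, key=lambda a: q[(state, a)])

            next_state, cost, done = env.step(action)
            best_next = 0.0 if done else min(q[(next_state, a)] for a in ACTIONS)
            # Temporal-difference update toward cost plus discounted future cost.
            q[(state, action)] += alpha * (cost + gamma * best_next - q[(state, action)])
            state = next_state

    # Greedy (cost-minimizing) policy extracted from the learned Q-values.
    return lambda s: min(ACTIONS, key=lambda a: q[(s, a)])
```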
