Content caching is an effective technique for alleviating the burden on backhaul links and reducing traffic in cellular networks. In converged networks, broadcasting networks can push popular non-real-time services to different types of terminals during off-peak hours, while cellular networks serve personalized demands. However, the storage capacity of terminals is usually limited. In this context, we study converged networks that push and cache popular services in router nodes close to the terminals. In this scheme, the most popular services are transmitted by the broadcasting base station and cached in the router nodes of a distributed cache network, so that users can access the cached services in a more energy-efficient manner. Because the storage capacity of each router node is limited, we assume that a user can access cached services within two hops. We then formulate the service scheduling problem as a Markov Decision Process (MDP), aiming to maximize the equivalent throughput (ET). Owing to the large state space of the distributed cache network, obtaining a tractable solution with classical optimization algorithms is quite challenging, so we propose a deep reinforcement learning based framework to solve the problem. Simulation results show that the proposed algorithms are effective and outperform the conventional scheme in terms of ET, especially when the users in the network follow a Poisson point process distribution.
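To make the scheduling loop concrete, the sketch below shows a minimal DQN-style learner on a toy stand-in for the distributed cache network. The environment, its dimensions, and the reward (a crude proxy for ET) are illustrative assumptions, not the paper's actual MDP formulation or network architecture.

```python
import random
import torch
import torch.nn as nn

# Toy stand-in for the distributed cache network: the state encodes which of
# N_SERVICES popular services are cached at each of N_ROUTERS router nodes;
# the action selects which service the broadcasting station pushes next.
# All names, dynamics, and the reward model are illustrative assumptions.
N_ROUTERS, N_SERVICES = 4, 8
STATE_DIM = N_ROUTERS * N_SERVICES

class CacheEnv:
    def reset(self):
        self.cache = torch.zeros(N_ROUTERS, N_SERVICES)
        return self.cache.flatten()

    def step(self, action):
        # Push the chosen service to a random router node.
        router = random.randrange(N_ROUTERS)
        self.cache[router, action] = 1.0
        # Reward: fraction of randomly sampled user requests served from
        # cache -- a crude proxy for the equivalent-throughput objective.
        requests = torch.randint(0, N_SERVICES, (16,))
        hits = self.cache[:, requests].max(dim=0).values
        return self.cache.flatten(), hits.mean().item()

# Q-network mapping the cache state to per-service action values.
q_net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                      nn.Linear(64, N_SERVICES))
opt = torch.optim.Adam(q_net.parameters(), lr=1e-3)
env, gamma, eps = CacheEnv(), 0.95, 0.1

state = env.reset()
for step in range(1000):
    # Epsilon-greedy selection over which service to push.
    if random.random() < eps:
        action = random.randrange(N_SERVICES)
    else:
        action = q_net(state).argmax().item()
    next_state, reward = env.step(action)
    # One-step temporal-difference target (DQN-style update).
    with torch.no_grad():
        target = reward + gamma * q_net(next_state).max()
    loss = (q_net(state)[action] - target) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()
    state = next_state
```

A full implementation would add experience replay and a separate target network, which are standard in deep Q-learning; they are omitted here to keep the scheduling loop readable.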