Abstract

Flourishing vehicular applications require vehicles to download huge amounts of Internet data, which consumes significant backhaul bandwidth and considerable time for content delivery. Caching popular data at network edge stations can alleviate congestion in the backhaul network and reduce data delivery delay. In this paper, we propose a dynamic edge caching policy for heterogeneous vehicular networks via reinforcement learning that adapts to traffic intensity and hot-content popularity. We aim to improve the download rate of vehicles under dynamic vehicle velocities and a changing hot-file pool by caching popular content at heterogeneous network edge stations. The proposed policy uses real-time information and jointly considers the download rate of each file and the utility of each edge station to improve overall download performance. Simulation results show that the proposed policy achieves a higher download rate than random and fixed caching policies.
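For readers unfamiliar with how a reinforcement-learning caching policy of this kind is commonly structured, the minimal sketch below shows one possible tabular Q-learning loop in Python. It is not the paper's algorithm: the state encoding (discretized traffic intensity plus cache contents), the toy reward model, the cache size, and all learning parameters are assumptions made purely for illustration.

```python
import random
from collections import defaultdict

# Illustrative sketch only: a tabular Q-learning loop for edge caching.
# All names, the reward model, and the parameters are hypothetical and
# not taken from the paper; the environment is a stand-in simulator.

NUM_FILES = 20          # size of the hot-file pool (assumed)
CACHE_SIZE = 5          # files an edge station can hold (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

def simulate_download_rate(cached_files, traffic_level):
    """Toy reward: caching popular (low-index) files under light traffic
    yields a higher aggregate download rate."""
    popularity_gain = sum(1.0 / (f + 1) for f in cached_files)
    return popularity_gain / (1 + traffic_level)

q_table = defaultdict(float)  # Q[(state, action)] -> value

def choose_action(state):
    """Epsilon-greedy choice of which file to admit into the cache."""
    if random.random() < EPS:
        return random.randrange(NUM_FILES)
    return max(range(NUM_FILES), key=lambda a: q_table[(state, a)])

cache = set()
traffic_level = 0  # discretized traffic intensity (assumed levels 0..2)

for step in range(10_000):
    state = (traffic_level, tuple(sorted(cache)))
    action = choose_action(state)

    # Apply the caching decision: admit the chosen file, evicting the
    # least popular cached file if the cache is full.
    cache.add(action)
    if len(cache) > CACHE_SIZE:
        cache.remove(max(cache))

    # Traffic intensity drifts randomly between discrete levels.
    traffic_level = min(2, max(0, traffic_level + random.choice((-1, 0, 1))))
    reward = simulate_download_rate(cache, traffic_level)

    # Standard one-step Q-learning update toward the observed reward.
    next_state = (traffic_level, tuple(sorted(cache)))
    best_next = max(q_table[(next_state, a)] for a in range(NUM_FILES))
    q_table[(state, action)] += ALPHA * (
        reward + GAMMA * best_next - q_table[(state, action)]
    )
```

In the paper's setting, the reward would instead reflect the measured download rate and station utility described in the abstract, and the state would capture the observed traffic intensity and content popularity rather than this toy drift model.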
