Abstract

Edge caching has emerged as a promising technique for reducing the latency caused by the explosive growth of mobile data traffic by caching popular contents at edge networks. However, the dynamic nature of content popularity and the limited caching capacity make it challenging to design an effective caching scheme that reduces latency. To address this, a learning-based hierarchical edge caching (LHEC) scheme is proposed in this work. We first propose a novel deep learning architecture, namely the Stacked Autoencoder-Long Short Term Memory Network (SAE-LSTMNet), to capture both the correlation of the request patterns among different contents and the periodicity in the time domain, thereby improving the accuracy of content popularity prediction. Then, to predict the popularity of newly-added contents, a dynamic content catalog is introduced and a similarity-based content popularity prediction (SCPP) approach is proposed. Based on the predicted content popularity, a hierarchical edge caching optimization problem is formulated to minimize the average content downloading latency. Since the formulated problem is NP-hard and difficult to solve, a low-complexity algorithm is proposed to obtain near-optimal solutions. Simulation results show that the proposed content popularity prediction approach reduces the mean absolute error by up to 6.36% compared with state-of-the-art methods, and the proposed LHEC scheme reduces the average downloading latency by about 5.3%–7.9% compared with existing caching schemes.
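The abstract does not specify how the SCPP approach computes similarity between a newly-added content and existing contents in the catalog. As a minimal illustrative sketch, assuming cosine similarity over per-content feature (or historical request) vectors and a similarity-weighted average of observed popularities, the idea could look like this; the function names and the weighting scheme are assumptions, not the paper's exact method:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def predict_new_content_popularity(new_features, catalog):
    """Hypothetical SCPP-style predictor (illustrative only).

    catalog: list of (feature_vector, observed_popularity) pairs for
    contents already in the dynamic content catalog. The newcomer's
    popularity is estimated as a similarity-weighted average of the
    popularities of the existing contents.
    """
    sims = [(cosine_similarity(new_features, f), p) for f, p in catalog]
    total = sum(s for s, _ in sims)
    if total == 0:
        # No similar content found: fall back to the catalog mean.
        return sum(p for _, p in catalog) / len(catalog)
    return sum(s * p for s, p in sims) / total
```

A content whose features exactly match one catalog entry inherits that entry's popularity; a content equally similar to several entries receives their weighted average.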

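The abstract states only that the hierarchical caching problem is NP-hard and that a low-complexity algorithm yields near-optimal solutions, without giving the algorithm itself. A common heuristic for capacity-constrained cache placement (a generic sketch under that assumption, not the paper's algorithm) is to greedily cache the contents with the highest expected latency saving per unit of cache space:

```python
def greedy_cache_placement(sizes, latency_saving, capacity):
    """Greedy knapsack-style heuristic for cache placement (illustrative).

    sizes: dict content -> size; latency_saving: dict content -> expected
    latency reduction if that content is cached (e.g. predicted popularity
    times the delay gap between the origin server and the edge cache).
    Picks contents by saving density until the capacity is exhausted.
    """
    remaining = capacity
    cached = []
    # Rank contents by latency saving per unit of cache space, descending.
    ranked = sorted(sizes, key=lambda c: latency_saving[c] / sizes[c],
                    reverse=True)
    for content in ranked:
        if sizes[content] <= remaining:
            cached.append(content)
            remaining -= sizes[content]
    return cached
```

Such density-based greedy placement runs in O(n log n) time, which is consistent with the low-complexity requirement, though the paper's hierarchical setting (coordinating caches across tiers) would require additional structure.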