Abstract

Over the last few decades, the Internet has experienced tremendous growth in data traffic. This continuous growth, driven by the increasing number of connected devices and platforms, has dramatically boosted content consumption. However, retrieving content from the servers of Content Providers (CPs) can increase network traffic and incur high network delay and congestion. To address these challenges, we propose a joint deep learning and auction-based approach for congestion-aware caching in Named Data Networking (NDN), which aims to prevent congestion and minimize content download delays. First, using recorded network traffic data on the Internet Service Provider (ISP) network, we propose a deep learning model to predict future traffic over transit links. Second, to prevent congestion and avoid high latency on transit links that may become congested in the future, we propose a caching model that helps the ISP cache content with high predicted future demand. Paid content requires payment before it can be downloaded and cached; therefore, we propose an auction mechanism to obtain paid content at an optimal price. The simulation results show that our proposal prevents congestion and increases the profits of both ISPs and CPs.
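The form of the auction is not specified in this section; purely as a hedged illustration, the sketch below assumes a sealed-bid, reverse second-price (Vickrey-style) exchange in which competing CPs submit asking prices for the same paid content, the lowest ask wins, and the winner is paid the second-lowest ask. The function, CP names, and prices are hypothetical, not taken from the paper.

```python
def reverse_second_price_auction(asks):
    """Hypothetical sketch: the Access ISP procures one paid content item.

    `asks` maps each Content Provider to the price it asks for the item.
    The CP with the lowest ask wins and is paid the second-lowest ask
    (a reverse Vickrey rule), which keeps truthful asking optimal.
    """
    if len(asks) < 2:
        raise ValueError("need at least two competing CPs")
    ranked = sorted(asks.items(), key=lambda cp_ask: cp_ask[1])
    winner, _ = ranked[0]
    payment = ranked[1][1]          # second-lowest ask
    return winner, payment

# Example: three CPs offer the same paid content.
winner, price = reverse_second_price_auction({"CP-A": 8.0, "CP-B": 5.5, "CP-C": 7.2})
print(winner, price)                # CP-B wins and is paid 7.2
```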

Highlights

  • In recent years, Internet traffic has continued to increase due to the growing number of connected devices with emerging platforms such as digital assistants, virtual, augmented, and mixed reality

  • We propose a deep learning model that helps the Access Internet Service Provider (ISP) to record transit traffic volume, learn the transit traffic characteristics, and predict the traffic volume that needs to be sent over the transit link(s), as sketched after these highlights

  • To prevent congestion and reduce the transit traffic volume passing through congested transit links, we propose a caching approach in which the Access ISP downloads and caches content with high predicted traffic volume over the transit link(s)
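The exact prediction architecture is not given in this section. As a minimal sketch, assuming a univariate time series of recorded transit traffic volumes and an LSTM-style recurrent forecaster (one common choice for such prediction, not necessarily the paper's), the Access ISP could predict the next interval's volume as follows; all names, shapes, and hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

class TransitTrafficPredictor(nn.Module):
    """Illustrative LSTM forecaster for transit-link traffic volume.

    Input:  (batch, window, 1) recent traffic samples from the ISP's logs.
    Output: (batch, 1) predicted volume for the next interval.
    """
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)            # out: (batch, window, hidden)
        return self.head(out[:, -1, :])  # predict from the last hidden state

# Toy training step on synthetic data (a stand-in for recorded transit traffic).
model = TransitTrafficPredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 24, 1)                # 32 windows of 24 past samples
y = torch.rand(32, 1)                    # next-interval volumes
optimizer.zero_grad()
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
```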



Introduction

Internet traffic has continued to increase due to the growing number of connected devices with emerging platforms such as digital assistants, virtual, augmented, and mixed reality. It is estimated that, by the year 2020, there will be 50 billion things connected to the Internet, which is equivalent to six devices per person [1]. This large-scale increase in both connected devices and platforms will tremendously increase content consumption. Caching content in close proximity to users is one solution to reduce network traffic, i.e., bandwidth consumption, on transit links [2,3,4,5,6,7,8].
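The caching step ranks content by its predicted demand over the transit link(s) and fills the Access ISP's cache accordingly. A minimal sketch of that selection, assuming per-content demand predictions, content sizes, and a fixed cache capacity (all hypothetical inputs; the paper's exact policy is not reproduced here), is the greedy rule below.

```python
def select_contents_to_cache(predicted_demand, sizes, capacity):
    """Greedily cache the contents with the highest predicted transit demand.

    predicted_demand: {content_id: predicted traffic volume over the transit link}
    sizes:            {content_id: content size}
    capacity:         total cache capacity at the Access ISP (same unit as sizes)
    Returns the set of content ids chosen for caching.
    """
    cached, used = set(), 0
    for cid in sorted(predicted_demand, key=predicted_demand.get, reverse=True):
        if used + sizes[cid] <= capacity:
            cached.add(cid)
            used += sizes[cid]
    return cached

# Example: cache the most demanded items that fit in a 10-unit cache.
demand = {"video1": 120.0, "video2": 80.0, "update3": 60.0}
sizes = {"video1": 6, "video2": 5, "update3": 3}
print(select_contents_to_cache(demand, sizes, capacity=10))  # {'video1', 'update3'}
```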

