Abstract

To enhance network performance, the named data networking (NDN) architecture caches Data packets in network nodes along the downstream path, then uses these cached packets to respond to new Interest packets. A cache management scheme (CMS) is therefore central to NDN. A CMS generally considers two main factors: short response time and storage efficiency. To respond rapidly to requests, a CMS tries to cache Data packets as near to users as possible. To manage storage efficiently, it uses the popularity of the data: proportionally to that popularity, it increases the number of nodes caching Data packets and manages the lifetime of caches. In the real world, however, few data objects are popular enough to be enjoyed by many users globally. Hence, if the assumptions about content usage do not hold in practice, a CMS can waste cache storage without significantly improving network efficiency. We show that many caches expire without ever being used. To reduce this inefficiency, this paper proposes to apply two cache-decision factors simultaneously: the expected frequency of a cache hit and the popularity of the data. That is, it proposes to gradually cache transmitted data at nodes whose expected cache-usage frequency is relatively high. To show the effectiveness of our proposal, we implement LdC (a limited domain cache policy) and evaluate its performance. The evaluation shows that LdC can enhance cache-storage efficiency by up to 65% compared with existing CMSs without degrading network efficiency.

Highlights

  • Since P2P networking technology was first introduced, it has become the general approach of advanced network technologies to use caches, which are saved in multiple nodes

  • Considering the transmission overheads for forwarding Interest/Data, the storage overheads for caching Data, and the computation overheads for searching the matched Data in in-network caches, many cache management schemes (CMSs) proposed to selectively and gradually cache transmitted Data proportionally to the request frequency of the Data [5,6,7,8,9,10,11]

  • We propose a simple cache policy; like LCD/WAVE, it gradually increases the number of nodes caching Data


Summary

Introduction

Since P2P networking technology was first introduced, it has become the general approach of advanced network technologies to use caches (i.e., replicas of data) that are saved in multiple nodes. A common approach is to cache Data as near to a requester-side edge network (RsEN) as possible, expecting that an Interest generated in the RsEN can be answered by nodes within the RsEN without transmitting the Interest to the core network. For this to work, it must be assumed that 'many' nodes of 'many' RsENs have cached the relevant Data. That is not a practical assumption, because most content objects are not as popular as expected. Like existing CMSs, we gradually increase the number of nodes caching Data proportionally to the generation frequency of the Interests requesting that Data. Unlike them, we select nodes whose expected cache-hit frequency is relatively high. Applying these factors, the proposed scheme can enhance the storage efficiency of nodes by up to 65% compared with existing CMSs without degrading network efficiency.
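The two-factor decision described above (cache only data that is locally popular, and keep a cache only where hits actually occur) can be sketched as a per-node rule. This is a hypothetical illustration under assumed names and thresholds (`LdcNode`, `popularity_threshold`, `hit_threshold`), not the paper's actual LdC implementation:

```python
# Sketch of a two-factor cache decision, assuming hypothetical names.
# A node counts Interests per content name (local popularity) and cache
# hits per name (observed cache-usage frequency); Data is cached only
# once local demand is established, and an expiring entry survives only
# if it actually produced hits.

from collections import defaultdict

class LdcNode:
    def __init__(self, popularity_threshold=3, hit_threshold=1):
        self.request_count = defaultdict(int)  # Interests seen per name
        self.hit_count = defaultdict(int)      # cache hits per name
        self.store = {}                        # content store: name -> Data
        self.popularity_threshold = popularity_threshold
        self.hit_threshold = hit_threshold

    def on_interest(self, name):
        """Count the request and answer from the content store on a hit."""
        self.request_count[name] += 1
        if name in self.store:
            self.hit_count[name] += 1
            return self.store[name]  # hit: respond without going upstream
        return None                  # miss: forward the Interest upstream

    def on_data(self, name, data):
        """Cache transmitted Data only after local demand is established."""
        if self.request_count[name] >= self.popularity_threshold:
            self.store[name] = data

    def expire(self, name):
        """On lifetime expiry, keep the entry only if it was actually used."""
        if self.hit_count[name] < self.hit_threshold:
            self.store.pop(name, None)
```

Because each node applies the rule independently, caching spreads gradually toward the requester side only along paths where Interests keep arriving, which is the behavior the introduction attributes to LCD/WAVE-style schemes.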

NDN Overview
NDN Cache Management Overview
A Limited Domain Cache Policy
Performance Evaluation
Findings
Conclusions
