Abstract

Content Centric Networking (CCN) is an emerging network architecture that shifts from end-to-end connections to a content-centric communication model. Each CCN router contains a content store module that caches the chunks passing through it, and routers may be arranged in an arbitrary network topology. Allocating an appropriate cache size to each router is therefore important both for improving network performance and for reducing economic investment. Previous works have proposed several heterogeneous cache allocation schemes, but the gains they deliver are limited. In this paper, we introduce a data mining method into cache size allocation. The proposed algorithm uses manifold learning to analyze regularities in network traffic and user behavior, and classifies routers according to their roles in content delivery. Guided by the manifold learning embedding results, a novel cache size optimization scheme is developed. Extensive experiments have been performed to evaluate the proposed scheme, and simulation results show that it outperforms existing cache allocation schemes in CCN.
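The abstract does not give implementation details, so the following is only a minimal sketch of the pipeline it describes, under assumed inputs: per-router traffic feature vectors (here randomly generated), a standard manifold learning embedding (scikit-learn's Isomap), clustering of the embedded routers into roles, and an illustrative importance score used to split a total cache budget. None of the specific choices below (feature definition, number of clusters, importance measure) are taken from the paper.

```python
# Hypothetical sketch of the described pipeline; feature construction,
# clustering, and the importance score are assumptions for illustration.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Assumed input: one row per CCN router, columns are traffic features
# (e.g., chunk request counts observed in successive time windows).
num_routers, num_features = 30, 50
traffic_features = rng.poisson(lam=20, size=(num_routers, num_features)).astype(float)

# Step 1: manifold learning embeds routers into a low-dimensional space
# that reflects regularities in the traffic they carry.
embedding = Isomap(n_neighbors=5, n_components=2).fit_transform(traffic_features)

# Step 2: classify routers by their role in content delivery by
# clustering the embedded points (the number of roles is an assumption).
roles = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(embedding)

# Step 3: allocate a total cache budget in proportion to a simple
# importance score derived from the embedding (distance from the
# embedding centroid), an illustrative stand-in for the paper's scheme.
total_cache_chunks = 10_000
importance = np.linalg.norm(embedding - embedding.mean(axis=0), axis=1) + 1e-9
cache_sizes = np.round(total_cache_chunks * importance / importance.sum()).astype(int)

for router_id, (role, size) in enumerate(zip(roles, cache_sizes)):
    print(f"router {router_id}: role={role}, cache={size} chunks")
```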
