Abstract

This paper examines a novel cache management policy applied to non-collaborative and collaborative environments of multiple proxy servers that serve homogeneous or heterogeneous client requests for video streaming over the Internet. The policy, which we call LRLFU because it combines elements of the LRU and LFU policies, captures the changing popularities of the various videos by attaching a caching value to every video according to how recently and how frequently the video was requested, and caches the most 'valuable' videos. Our event-driven simulations have shown that LRLFU, when applied to a simple non-collaborative topology of proxies and compared with previous work in this area, (1) improves the byte-hit ratio (BHR), (2) significantly reduces the fraction of user requests with delayed starts, and (3) requires less CPU overhead. Furthermore, our simulation results have shown that the collaborative hierarchical tree topology of proxies that we examine achieves a much higher BHR than the simple topology when using the same overall cache capacity and, in general, provides better performance characteristics.
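
To make the recency/frequency combination concrete, the sketch below shows one way such a caching value could be computed and used for eviction. The scoring formula, the decay parameter, and the whole-video caching granularity are illustrative assumptions for this sketch, not the policy as specified in the paper.

```python
import math
import time


class LRLFUCacheSketch:
    """Minimal sketch of an LRLFU-style proxy cache (hypothetical scoring).

    Each video receives a caching value that grows with request frequency
    and decays with time since its last request; on overflow, the videos
    with the lowest value are evicted first. The exact formula below is an
    assumption made for illustration only.
    """

    def __init__(self, capacity_bytes, decay=0.01):
        self.capacity = capacity_bytes
        self.decay = decay          # assumed recency decay rate
        self.used = 0
        self.entries = {}           # video_id -> (size, request_count, last_access)

    def _value(self, count, last_access, now):
        # Assumed combination: request frequency weighted by an exponential
        # recency factor (more recent and more frequent => higher value).
        return count * math.exp(-self.decay * (now - last_access))

    def request(self, video_id, size_bytes):
        """Register a client request; return True on a cache hit."""
        now = time.time()
        if video_id in self.entries:
            size, count, _ = self.entries[video_id]
            self.entries[video_id] = (size, count + 1, now)
            return True
        # Cache miss: evict the lowest-valued videos until the new one fits.
        while self.used + size_bytes > self.capacity and self.entries:
            victim = min(
                self.entries,
                key=lambda v: self._value(self.entries[v][1], self.entries[v][2], now),
            )
            self.used -= self.entries.pop(victim)[0]
        if size_bytes <= self.capacity:
            self.entries[video_id] = (size_bytes, 1, now)
            self.used += size_bytes
        return False
```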
