Abstract

Measurements and observations indicate that Web accesses are well approximated by Zipf's law. This property is an important tool in the design of Web caching architectures, because it allows designers to calculate the cache size required to achieve a desired hit ratio. An appropriately sized cache combined with an LFU replacement policy achieves high cache hit rates. However, LFU replaces objects based on frequency measurements of past accesses, so the system reaches high hit rates only once these measurements become reliable and the observed popularity distribution converges to the underlying Zipf distribution. In this paper, we provide an analysis using Chernoff's bound and calculate an upper bound on the number of initial requests that must be processed in order to obtain popularity measurements with high confidence and a measured Zipf distribution that converges to the correct one.
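
As an illustration of the kind of calculation the abstract describes (not the paper's actual derivation), the sketch below estimates, for a hypothetical catalogue of N objects with Zipf exponent alpha, how many top-ranked objects an idealized LFU cache would need to hold to reach a target hit ratio, and uses a standard multiplicative Chernoff bound to gauge how many requests are needed before an object's measured popularity is within a factor (1 +/- eps) of its true value with confidence 1 - delta. All parameter values (N, alpha, eps, delta) are illustrative assumptions.

    import math

    def zipf_probs(n, alpha=1.0):
        # Zipf popularity: p_i proportional to 1 / i^alpha, normalized over n objects.
        weights = [1.0 / (i ** alpha) for i in range(1, n + 1)]
        total = sum(weights)
        return [w / total for w in weights]

    def min_cache_size(probs, target_hit_ratio):
        # Smallest k such that the k most popular objects cover the target hit
        # ratio -- an idealized LFU cache that already knows the true popularities.
        cumulative = 0.0
        for k, p in enumerate(probs, start=1):
            cumulative += p
            if cumulative >= target_hit_ratio:
                return k
        return len(probs)

    def chernoff_request_bound(p, eps, delta):
        # Multiplicative Chernoff bound: after m requests, the observed frequency
        # of an object with true popularity p deviates from p by more than a
        # factor (1 +/- eps) with probability at most delta once
        #     m >= 3 * ln(2 / delta) / (eps**2 * p).
        return math.ceil(3.0 * math.log(2.0 / delta) / (eps ** 2 * p))

    if __name__ == "__main__":
        N = 100_000                       # hypothetical catalogue size
        probs = zipf_probs(N, alpha=0.8)  # hypothetical Zipf exponent
        for target in (0.5, 0.7, 0.9):
            k = min_cache_size(probs, target)
            print(f"hit ratio {target:.0%}: cache the top {k} objects")
        # Requests needed before the least popular cached object's frequency
        # estimate is within 20% of its true value with 99% confidence.
        k90 = min_cache_size(probs, 0.9)
        m = chernoff_request_bound(probs[k90 - 1], eps=0.2, delta=0.01)
        print(f"~{m} requests for a reliable estimate of object rank {k90}")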
