Abstract

We consider distributed caching strategies for networks in a model that takes into account local accesses in addition to remote accesses. The goal is to minimize the congestion while obeying the memory capacity constraints of the network. The on-line strategies are evaluated by a competitive analysis, in which their cost is compared with the cost of an optimal off-line strategy. Previous results either depend on the network size or assume that the on-line strategies are granted larger memory capacities than the optimal off-line strategy.

Our main result is a strategy for complete networks. For each node v, we are given a memory capacity m(v) and a load d(v) for a remote access; the load for a local access is one. For every application concerning a set X of shared data objects with |X| ≤ Σ_v m(v)/d(v), the strategy achieves, w.h.p., a competitive ratio of O((r_max / r_avg) · max_v log(min(d(v), m(v)))) with respect to the congestion at the nodes, where r_max = max_v m(v) / min_v d(v) and r_avg = (1/n) · Σ_v m(v)/d(v), with n denoting the number of nodes.
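For illustration only (a worked instance of the stated bound under an assumed uniform setting, not taken from the paper), consider a complete network with n nodes in which every node v has m(v) = m and d(v) = d. Then

  r_max = max_v m(v) / min_v d(v) = m/d,
  r_avg = (1/n) · Σ_v m(v)/d(v) = (1/n) · n · (m/d) = m/d,

so r_max / r_avg = 1 and the competitive ratio simplifies to O(log min(d, m)) w.h.p., while the constraint on the shared data becomes |X| ≤ n · m/d.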
