Abstract

The data placement problem arises in the design and operation of Content Delivery Networks (CDNs): computer systems that efficiently distribute Internet traffic to users by replicating data objects (media files, applications, database queries, etc.) and caching them at multiple locations in the network. Replication not only reduces the processing load on the server hardware, but also helps to eliminate congestion in the transmission network. Today, all major Internet content providers rely on such systems to deliver their services. In this paper we formulate the data placement problem as a quadratic binary programming problem that takes into account server processing time, storage capacity, and communication bandwidth. Two decomposition-based solution approaches are proposed: one based on Lagrangian relaxation and one based on randomized rounding. Computational experiments are conducted to evaluate and compare the performance of the presented algorithms.
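For context, a binary quadratic program in its general form is shown below; the symbols Q, c, A, and b stand for generic problem data and are not the specific coefficients derived in the paper.

\min_{x \in \{0,1\}^n} \; x^{\top} Q x + c^{\top} x \quad \text{subject to} \quad A x \le b

In models of this kind, the quadratic term commonly arises from products of binary decisions (for example, placing an object at a location and serving a request from that location), while the linear constraints capture resource limits such as storage, processing time, and bandwidth.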

Highlights

  • In the last decade, locational analysis has received special attention from the computer networking community, owing to the widespread development of large-scale Internet applications

  • In this paper we formulate a very general class of optimization problems arising in the planning of content delivery networks and other large-scale data delivery systems on the Internet

  • Our model incorporates all three resources that need to be taken into account in the context of computer systems, namely server processing time, storage capacity, and communication bandwidth, and is essentially an extension of the data placement problem studied by Baev et al. (2008)


Summary

Introduction

In the last decade, locational analysis has received special attention from the computer networking community, owing to the widespread development of large-scale Internet applications. Deploying a number of application servers in different locations to provide replicated data to users has become a necessity for high-traffic Web sites. Replication reduces the processing load on the server hardware, helps to eliminate congestion in the transmission network, and improves reliability. The first decomposition proposed in this paper is based on Lagrangian relaxation; the second is based on the linear programming relaxation and employs randomized rounding. Both decompositions involve solving the generalized assignment problem, which has for many years been a central problem in applications of operations research and does not admit constant-factor approximations without violating the capacity constraints. Three important problems that are of independent interest constitute the basis of the proposed solutions: the subgradient ascent method, the generalized assignment problem, and the knapsack problem.
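A minimal, self-contained Python sketch of the first building block, Lagrangian relaxation solved by subgradient ascent with 0/1 knapsack subproblems, is given below. The toy model (storage capacities only, integer object sizes) and all names are simplifications for illustration, not the formulation or algorithm from the paper.

# Toy placement model:
#   minimize   sum_{i,j} c[i][j] * x[i][j]
#   subject to sum_j x[i][j] = 1               (store each object i somewhere)
#              sum_i s[i] * x[i][j] <= cap[j]  (storage capacity of location j)
#              x[i][j] in {0, 1}
# Relaxing the "= 1" constraints with multipliers lam[i] makes the problem
# separate into one 0/1 knapsack per location.

def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack DP; returns (best value, indices of chosen items)."""
    n = len(values)
    best = [[0.0] * (capacity + 1) for _ in range(n + 1)]
    for k in range(1, n + 1):
        v, w = values[k - 1], weights[k - 1]
        for c in range(capacity + 1):
            best[k][c] = best[k - 1][c]
            if w <= c and best[k - 1][c - w] + v > best[k][c]:
                best[k][c] = best[k - 1][c - w] + v
    chosen, c = [], capacity              # backtrack to recover the chosen set
    for k in range(n, 0, -1):
        if best[k][c] != best[k - 1][c]:
            chosen.append(k - 1)
            c -= weights[k - 1]
    return best[n][capacity], chosen

def lagrangian_dual(c, s, cap, iters=200, step0=1.0):
    """Subgradient ascent on the dual obtained by relaxing the assignment
    constraints; returns a lower bound on the optimal placement cost."""
    n_obj, n_loc = len(c), len(c[0])
    lam = [0.0] * n_obj                   # one multiplier per relaxed constraint
    best_bound = float("-inf")
    for t in range(iters):
        # Relaxed problem: for each location, pick the most "profitable"
        # objects (profit lam[i] - c[i][j]) subject to its storage capacity.
        x = [[0] * n_loc for _ in range(n_obj)]
        dual_value = sum(lam)
        for j in range(n_loc):
            items = [i for i in range(n_obj) if lam[i] - c[i][j] > 0]
            profit, chosen = knapsack([lam[i] - c[i][j] for i in items],
                                      [s[i] for i in items], cap[j])
            dual_value -= profit
            for k in chosen:
                x[items[k]][j] = 1
        best_bound = max(best_bound, dual_value)
        # Subgradient = violation of the relaxed constraints.
        g = [1 - sum(x[i]) for i in range(n_obj)]
        if all(gi == 0 for gi in g):
            break                         # relaxed solution happens to be feasible
        step = step0 / (t + 1)            # diminishing step size
        lam = [lam[i] + step * gi for i, gi in enumerate(g)]
    return best_bound

# Example: 3 objects, 2 locations.
# bound = lagrangian_dual(c=[[1, 4], [2, 3], [5, 1]], s=[2, 3, 2], cap=[4, 4])

A full implementation would also include the processing-time and bandwidth constraints of the paper's model, and would typically drive the step size by the gap between the best dual bound and the best feasible solution found so far.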

Related work
Problem statement
Linearization
Decomposition based on Lagrangian relaxation
Decomposition with randomized rounding
Experimental results
Conclusions and further work