Abstract

Problems dealing with the assignment of clients to servers have been widely studied. However, they usually do not model the fact that the delay incurred by a client is a function of both the distance to the assigned server and the load on that server under a given assignment. We study a problem referred to as the load-distance balancing (LDB) problem, where the objective is to assign a set of clients to a set of given servers. Each client suffers a delay, which is the sum of the network delay (proportional to the distance to its server) and the congestion delay at this server, a nondecreasing function of the number of clients assigned to the server. We address two flavors of LDB: the first seeks to minimize the maximum incurred delay, and the second aims to minimize the average delay. For the first variant, we present hardness results, a best-possible approximation algorithm, and an optimal algorithm for the special case of a linear placement of clients and servers. For the second, we show that the problem is NP-hard in general, and present a 2-approximation algorithm for concave delay functions and an exact algorithm for convex delay functions. We also consider the game-theoretic version of the second problem and show that the price of stability of the game is at most 2 and at least 4/3. © 2011 Wiley Periodicals, Inc. NETWORKS, 2012
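As a sketch of the delay model described in the abstract (the notation below is assumed for illustration and may differ from the paper's own symbols): under an assignment $a$ mapping each client $c \in C$ to a server, the delay of $c$ is

\[
\operatorname{delay}_a(c) \;=\; \alpha \cdot d\bigl(c, a(c)\bigr) \;+\; f_{a(c)}\bigl(\lvert a^{-1}(a(c)) \rvert\bigr),
\]

where $d$ is the client-to-server distance, $\alpha > 0$ is a proportionality constant, $f_s$ is the nondecreasing congestion-delay function of server $s$, and $\lvert a^{-1}(s) \rvert$ is the number of clients assigned to $s$. The two objectives studied are then

\[
\min_{a} \max_{c \in C} \operatorname{delay}_a(c)
\qquad \text{and} \qquad
\min_{a} \frac{1}{\lvert C \rvert} \sum_{c \in C} \operatorname{delay}_a(c).
\]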
