Abstract

We design approximation algorithms for a number of fundamental optimization problems in metric spaces, namely computing separating and padded decompositions, sparse covers, and metric triangulations. Our work is the first to emphasize relative guarantees that compare the produced solution to the optimal one for the input at hand. By contrast, the extensive previous work on these topics has sought absolute bounds that hold for every possible metric space (or for a family of metrics). While absolute bounds typically translate to relative ones, our algorithms provide significantly better relative guarantees, using rather different techniques. Our technical approach is to cast a number of metric clustering problems that have been well studied---but almost always as disparate problems---into a common modeling and algorithmic framework, which we call the consistent labeling problem. Having identified the common features of all of these problems, we provide a family of linear programming relaxations and simple randomized rounding procedures that achieve provably good approximation guarantees.

Highlights

  • Metric spaces arise naturally in a variety of computational settings, and are commonly used to model diverse data sets such as latencies between nodes in the Internet, dissimilarity between objects such as documents and images, and the cost of traveling between physical locations

  • We study a number of basic metric clustering problems from an optimization perspective, and design polynomial-time algorithms that provably achieve a near-optimal clustering for every metric space

  • Most absolute bounds translate to relative ones, but our algorithms provide significantly better relative guarantees than those implied by the known absolute results


Summary

Introduction

Metric spaces arise naturally in a variety of computational settings, and are commonly used to model diverse data sets such as latencies between nodes in the Internet, dissimilarity between objects such as documents and images, and the cost of traveling between physical locations. An approximation algorithm guarantees a good solution provided only that one exists. This requires one to design a "unified" algorithm that works regardless of the precise reason the input admits an improved bound. The approximation algorithm recovers, from the existential proof, an efficient algorithm that achieves nearly the same absolute guarantees. In a sense, this is true for planar metrics, where, to date, no algorithm is known to efficiently determine whether an input metric is planar (or close to being planar). Our algorithms are based on linear programming (LP) relaxations, and automatically generate a "certificate" of near-optimality (namely, the optimal fractional solution). Such certificates could potentially be used to prove that a good solution does not exist.
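To make the LP-and-rounding pattern concrete, the following is a minimal sketch of label-driven randomized rounding on a toy consistent labeling instance. All names here (the instance, the fractional solution `x`, and the particular rounding distribution) are illustrative assumptions, not the paper's exact formulation: the fractional values are written by hand to stand in for an optimal LP solution, and the rounding simply picks a random label and grants it to each object independently with its fractional probability.

```python
import random

# Toy instance (illustrative, not from the paper): objects, allowed labels,
# and "consistency" sets. An assignment makes a set consistent if some single
# label is assigned to every object in that set.
objects = ["a", "b", "c"]
labels = ["red", "blue"]
sets = [{"a", "b"}, {"b", "c"}]
k = 1  # each object may receive at most k labels

# Hand-written fractional values x[(obj, lab)] in [0, 1], standing in for the
# optimum of an LP relaxation (which also serves as the near-optimality
# certificate mentioned above).
x = {("a", "red"): 1.0, ("a", "blue"): 0.0,
     ("b", "red"): 0.5, ("b", "blue"): 0.5,
     ("c", "red"): 0.0, ("c", "blue"): 1.0}

def round_once(rng):
    """One round of rounding: draw a label at random, then assign it to each
    object independently with probability x[(obj, lab)], never exceeding k
    labels per object."""
    assignment = {o: set() for o in objects}
    for _ in range(k):
        lab = rng.choice(labels)
        for o in objects:
            if len(assignment[o]) < k and rng.random() < x[(o, lab)]:
                assignment[o].add(lab)
    return assignment

def consistent_sets(assignment):
    """Return the sets whose members all received some common label."""
    return [S for S in sets
            if any(all(lab in assignment[o] for o in S) for lab in labels)]

rng = random.Random(0)
assignment = round_once(rng)
made_consistent = consistent_sets(assignment)
```

In the maximization version, one would repeat such rounds and argue that each set becomes consistent with probability comparable to its fractional contribution; the sketch above only shows the shape of a single round.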

Metric decompositions
Covering problems
Overview and techniques
Linear programming relaxations for CONSISTENT LABELING
Common ingredients
Maximization version
Minimization version
Maximum consistent labeling
Approximation algorithm for MAX CL and MAX FAIR CL
Separating decomposition
Padded decomposition
Minimum consistent labeling
COMPLETE CONSISTENT LABELING and SPARSE COVER
A bicriteria guarantee and application to METRIC TRIANGULATION
Hardness results
Findings
Concluding remarks