Abstract

Most real-world combinatorial optimization problems are affected by noise in the input data and thus behave, in the high-noise limit, like large disordered particle systems, e.g., spin glasses or random networks. Due to this input uncertainty, optimization of such disordered instances should infer stable posterior distributions of solutions conditioned on the noisy input instance. The maximum entropy principle states that the most stable distribution under a given noise influence is the Gibbs distribution, which is characterized by the free energy. In this paper, we first provide rigorous asymptotics of the free energy, a quantity that is notoriously difficult to compute, for two combinatorial optimization problems: the sparse Minimum Bisection Problem (sMBP) and Lawler's Quadratic Assignment Problem (LQAP). We prove that both problems exhibit phase transitions equivalent to the discontinuous behavior of Derrida's Random Energy Model (REM). Furthermore, the derived free energy asymptotics provide a theoretical justification of the recently introduced concept of Gibbs posterior agreement [3], which measures the stability of Gibbs distributions when the cost function fluctuates due to randomness in the input. This stability concept may provide a new method for selecting robust solutions for a large class of optimization problems.
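
For concreteness, the Gibbs distribution over candidate solutions c of an instance X with cost R(c, X) at inverse temperature β, together with its free energy, takes the following standard form (the notation here is generic and not taken verbatim from the paper):

```latex
% Gibbs distribution over candidate solutions c for instance X,
% at inverse computational temperature \beta:
p_\beta(c \mid X) = \frac{\exp\bigl(-\beta\, R(c, X)\bigr)}{Z_\beta(X)},
\qquad
Z_\beta(X) = \sum_{c \in \mathcal{C}} \exp\bigl(-\beta\, R(c, X)\bigr).

% The free energy characterizing this maximum entropy distribution:
F_\beta(X) = -\tfrac{1}{\beta} \log Z_\beta(X).
```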

Highlights

  • Combinatorial optimization arises in many real-world settings, and these problems are often notoriously difficult to solve due to data-dependent noise in the parameters defining the instances.

  • We provide rigorous asymptotics of the free energy, a quantity that is difficult to compute, for two combinatorial optimization problems: the sparse Minimum Bisection Problem (sMBP) and Lawler's Quadratic Assignment Problem (LQAP).

  • Both problems exhibit phase transitions equivalent to the discontinuous behavior of Derrida's Random Energy Model (REM).

Introduction

Combinatorial optimization arises in many real-world settings, and these problems are often notoriously difficult to solve due to data-dependent noise in the parameters defining the instances. Algorithm design in such noise-affected settings requires both statistical and computational considerations: first, we have to ensure that the outputs of algorithms are typical in a statistical sense, i.e., that they occur with high probability; second, such typical outputs have to be computable with efficient resources. Calculating typical solutions for typical inputs may require significantly different algorithmic resources (time and space) than minimizing the empirical risk. A minimal numerical sketch of this stability idea follows below.
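
To make the stability idea concrete, the sketch below draws two noisy realizations of a toy cost function, computes the Gibbs posterior over a small enumerable solution set for each, and scores their overlap. The cost model, the agreement score (here simply the inner product of the two posteriors), and all parameter values are illustrative assumptions, not the paper's exact construction of Gibbs posterior agreement.

```python
import numpy as np

rng = np.random.default_rng(0)

n_solutions = 16          # toy solution space C (small enough to enumerate)
true_cost = rng.uniform(0.0, 1.0, n_solutions)  # noise-free costs (assumed)
sigma = 0.2               # input noise level (assumed)
beta = 5.0                # inverse computational temperature

def gibbs_posterior(costs: np.ndarray, beta: float) -> np.ndarray:
    """Gibbs distribution p_beta(c|X) proportional to exp(-beta * R(c, X))."""
    w = np.exp(-beta * (costs - costs.min()))  # shift for numerical stability
    return w / w.sum()

# Two independent noisy instances X', X'' of the same underlying problem.
costs1 = true_cost + sigma * rng.standard_normal(n_solutions)
costs2 = true_cost + sigma * rng.standard_normal(n_solutions)

p1 = gibbs_posterior(costs1, beta)
p2 = gibbs_posterior(costs2, beta)

# Illustrative agreement score: the probability that independent draws from
# the two posteriors coincide (inner product of the two distributions).
agreement = float(p1 @ p2)
print(f"posterior agreement at beta={beta}: {agreement:.4f}")
```

Sweeping β in such a sketch trades off stability against informativeness: as β → 0 both posteriors become uniform and agree trivially, while for large β each posterior concentrates on the minimizer of its own noisy instance and the agreement can collapse.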
