Abstract

Discrete combinatorial optimization problems in the real world are typically defined via an ensemble of potentially high-dimensional measurements pertaining to all subjects of a system under study. We point out that such a data ensemble in fact embeds the system's information content, which is not directly used in defining the combinatorial optimization problem. Can machine learning algorithms extract this information content and make combinatorial optimization tasks more efficient? Would such algorithmic computations bring new perspectives into this classic topic of Applied Mathematics and Theoretical Computer Science? We show that the answers to both questions are positive. One key reason is permutation invariance: the data ensemble of subjects' measurement vectors is permutation invariant when it is represented as a subject-vs-measurement matrix. An unsupervised machine learning algorithm, called Data Mechanics (DM), is applied to find optimal permutations on the row and column axes such that the permuted matrix reveals coupled deterministic and stochastic structures as the system's information content. The deterministic structures are shown to facilitate a geometry-based divide-and-conquer scheme that aids the optimizing task, while the stochastic structures are used to generate an ensemble of mimicries retaining the deterministic structures, which then reveals the robustness of the original optimal solution. Two simulated systems, the Assignment problem and the Traveling Salesman problem, are considered. Beyond demonstrating computational advantages and intrinsic robustness in the two systems, we propose brand-new robust optimal solutions. We believe such robust versions of optimal solutions are potentially more realistic and practical in real-world settings.
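The permutation invariance mentioned above can be illustrated with a minimal toy example (not from the paper): relabeling the subjects of a subject-vs-measurement matrix permutes its rows but leaves all pairwise relations among subjects intact, up to the same relabeling.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # toy subject-vs-measurement matrix: 5 subjects, 3 measurements
perm = rng.permutation(5)     # an arbitrary relabeling of the subjects
Xp = X[perm]                  # the same data ensemble, rows permuted

# Pairwise subject-to-subject distances are unchanged up to the same relabeling,
# so any structure computed from them is a property of the ensemble, not of the row order.
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Dp = np.linalg.norm(Xp[:, None] - Xp[None, :], axis=-1)
assert np.allclose(Dp, D[np.ix_(perm, perm)])
```

This is exactly what licenses searching over row and column permutations: no information is lost or created by reordering, so a permutation can be chosen purely to make latent structure visible.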

Highlights

  • Discrete combinatorial optimization is a topic in Applied Mathematics and Computer Science

  • We demonstrate a machine learning algorithm, Data Mechanics, that can compute a geometry coupling deterministic multiscale block patterns with stochastic block-wise uniform randomness on a data matrix

  • When a discrete combinatorial optimization problem is defined upon a data matrix representing a system at one point in time, such a coupling geometry allows us to go beyond the classic idiosyncratic optimal solution


Summary

INTRODUCTION

Discrete combinatorial optimization is a topic in Applied Mathematics and Computer Science. The principles of statistical physics confirm that any microscopic state has to conform to the system's macroscopic state. Though this macrostate's implicit nature prevents it from playing any role in defining a specific discrete combinatorial optimization problem, many of its information patterns are computable via machine learning algorithms. The machine learning paradigm employed here to build the block patterns on a matrix lattice is called Data Mechanics (DM), developed in a series of papers [20,21,22,23]. Such block patterns are multiscale in the sense of being framed by two ultrametric trees on the row and column axes, respectively. The theme underpinning the computing algorithms developed here is “divide-and-conquer.” This type of data-driven approach is new in combinatorial optimization; a similar idea has been used recently in machine learning for solving continuous optimization problems. The newly developed algorithm is given after laying out the principal ideas of DM, for completeness and convenience.
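The flavor of such a reordering can be sketched with a toy stand-in (not the actual DM algorithm, which couples the two axes iteratively through its own energy criterion): here ordinary hierarchical clustering with Ward linkage, applied to rows and then columns, supplies the two tree-framed permutations that expose a scrambled block pattern.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list

# Toy data matrix with a hidden 2x2 block pattern, scrambled on both axes.
rng = np.random.default_rng(1)
blocks = np.kron(np.eye(2), np.ones((10, 5)))            # 20 subjects x 10 measurements
X = blocks + 0.1 * rng.normal(size=blocks.shape)
X = X[rng.permutation(20)][:, rng.permutation(10)]

# Stand-in for one DM pass: build a tree on each axis and permute that axis
# into the tree's leaf order, so rows and columns of the same block become contiguous.
row_order = leaves_list(linkage(X, method="ward"))
col_order = leaves_list(linkage(X.T, method="ward"))
X_ordered = X[np.ix_(row_order, col_order)]              # block pattern now visible
```

In DM proper the row and column trees are ultrametric and are refined against each other, but the sketch conveys the key output: a pair of axis permutations under which deterministic block structure becomes contiguous and thus usable for divide-and-conquer.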

DATA MECHANICS
Data Mechanics
Bipartite Network Mimicking and Its
Assignment Problem
Traveling Salesman Problem and Simulated Annealing
DISCUSSION
