Abstract

In computer science, there exist a large number of optimization problems defined on graphs: finding the node-state configuration or network structure that optimizes a designed objective function under some constraints. These problems are notorious for their hardness, because most of them are NP-hard or NP-complete. Although traditional general-purpose methods such as simulated annealing (SA) and genetic algorithms (GA) have been devised for these hard problems, their accuracy and running time are often unsatisfactory in practice. In this work, we propose a simple, fast, and general algorithmic framework based on the advanced automatic differentiation techniques provided by deep learning frameworks. By introducing the Gumbel-softmax technique, we can optimize the objective function directly by gradient descent regardless of the discrete nature of the variables. We also introduce an evolution strategy to obtain a parallel version of our algorithm. We test our algorithm on four representative optimization problems on graphs: modularity optimization from network science, the Sherrington–Kirkpatrick (SK) model from statistical physics, the maximum independent set (MIS) and minimum vertex cover (MVC) problems from combinatorial optimization on graphs, and the influence maximization problem from computational social science. High-quality solutions can be obtained in much less time than with traditional approaches.
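To make the framework concrete, here is a minimal sketch of a GSO-style optimization loop on a toy minimum vertex cover instance, written in PyTorch. The graph, penalty weight, learning rate, and step count are illustrative assumptions rather than the paper's settings:

    import torch
    import torch.nn.functional as F

    # Toy instance: a 4-cycle; each node is either out of (0) or in (1) the cover.
    edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 0]])
    n_nodes = 4

    # Learnable logits parameterize a categorical distribution over states per node.
    logits = torch.zeros(n_nodes, 2, requires_grad=True)
    optimizer = torch.optim.Adam([logits], lr=0.1)
    penalty = 5.0  # weight on the constraint term (illustrative value)

    for step in range(500):
        # Differentiable "sampling" of node states via Gumbel-softmax.
        x = F.gumbel_softmax(logits, tau=1.0, hard=False)  # shape (n_nodes, 2)
        s = x[:, 1]  # soft indicator that a node is in the cover
        # Objective: cover size plus a penalty for each uncovered edge,
        # i.e., an edge (u, v) with neither endpoint selected.
        uncovered = ((1 - s[edges[:, 0]]) * (1 - s[edges[:, 1]])).sum()
        loss = s.sum() + penalty * uncovered
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    solution = logits.argmax(dim=1)  # discretize the final configuration

Because the loop consists entirely of tensor operations, many random initializations can be batched and optimized on a GPU simultaneously, which is the kind of parallelism the evolution-strategy variant exploits.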

Highlights

  • In computer science, there exist a large number of optimization problems defined on graphs, e.g., the maximum independent set (MIS) and minimum vertex cover (MVC) problems [1]

  • We present a novel general optimization framework based on automatic differentiation techniques and the Gumbel-softmax trick, including Gumbel-softmax optimization (GSO) [20] and Evolutionary Gumbel-softmax optimization (EvoGSO)

  • In [20], we proposed Gumbel-softmax optimization (GSO), a novel general method for solving combinatorial optimization problems on graphs; the algorithm proposed here extends it with an evolutionary variant, EvoGSO

Introduction

There exist a large number of optimization problems defined on graphs, e.g., the maximum independent set (MIS) and minimum vertex cover (MVC) problems [1]. The Gumbel-softmax reparameterization trick has been applied to various machine learning problems [13, 14]. With such a trick, it is possible to treat many discrete optimization problems on graphs as continuous optimization problems [15] and to apply a series of gradient-descent-based algorithms [16]. For a categorical distribution with probabilities $p = (p_1, p_2, \ldots, p_K)$, the 1 appears at the $i$th position of a sampled one-hot vector with probability $p_i$; the computation of the Gumbel-softmax function simulates this sampling process:

$$y_i = \frac{\exp\left((\log p_i + g_i)/\tau\right)}{\sum_{j=1}^{K} \exp\left((\log p_j + g_j)/\tau\right)}, \qquad i = 1, \ldots, K, \tag{2}$$

where $g_1, \ldots, g_K$ are i.i.d. samples from the Gumbel(0, 1) distribution and $\tau$ is a temperature parameter. This technique allows us to pass gradients directly through the "sampling" process, because all the operations in Eq. 2 are differentiable. We can thus take full advantage of GPU acceleration and are more likely to obtain better results.
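As a sketch of why gradients can flow through this step, a direct implementation of Eq. 2 might look as follows; the function name and the small epsilon are our own choices, and PyTorch also ships this operation as torch.nn.functional.gumbel_softmax:

    import torch

    def gumbel_softmax_sample(log_p: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
        """Differentiable surrogate for drawing a one-hot sample from a
        categorical distribution with log-probabilities log_p (Eq. 2)."""
        # Gumbel(0, 1) noise via inverse transform sampling (eps avoids log(0)).
        u = torch.rand_like(log_p)
        g = -torch.log(-torch.log(u + 1e-10) + 1e-10)
        # Softmax of the perturbed log-probabilities; as tau -> 0 the output
        # approaches a one-hot argmax sample, yet every operation stays differentiable.
        return torch.softmax((log_p + g) / tau, dim=-1)

    # Example: a soft sample from p = (0.2, 0.3, 0.5).
    y = gumbel_softmax_sample(torch.log(torch.tensor([0.2, 0.3, 0.5])))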
