Abstract

The minimum cut problem, MC, and its special case, the vertex separator problem, consist of partitioning the set of nodes of a graph G into k subsets of given sizes so as to minimize the number of edges cut after removing the k-th set. Previous work on approximate solutions uses, in increasing strength and expense, eigenvalue, semidefinite programming, SDP, and doubly nonnegative, DNN, bounding techniques. In this paper, we derive strengthened SDP and DNN relaxations, and we propose a scalable algorithmic approach for efficiently evaluating both upper and lower bounds that are theoretically verifiable. Our stronger relaxations are based on a new gangster set, and we demonstrate how facial reduction, FR, fits in well to allow for regularized relaxations. Moreover, FR is well suited to a natural splitting of variables, and thus to the application of splitting methods. Here, we adopt the strictly contractive Peaceman-Rachford splitting method, sPRSM. Further, we bring useful redundant constraints back into the subproblems and show empirically that this accelerates sPRSM. In addition, we employ new strategies for obtaining lower and upper bounds on the optimal value of MC from approximate iterates of sPRSM, thus aiding in early termination of the algorithm. We compare our approach with others in the literature on random datasets and vertex separator problems; the results illustrate the efficiency and robustness of the proposed method.
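
For orientation, a generic form of sPRSM for a two-block splitting of the type mentioned above, min over R, Y of f(R) + g(Y) subject to R = Y, is sketched below; the notation (penalty parameter β, contraction parameter γ, multiplier Z, iteration counter j) is generic and is not taken from the paper, which may use a different splitting:

\[
\begin{aligned}
R^{j+1} &= \operatorname*{argmin}_{R}\; f(R) + \tfrac{\beta}{2}\bigl\|R - Y^{j} + Z^{j}/\beta\bigr\|^{2},\\
Z^{j+\frac12} &= Z^{j} + \gamma\beta\,\bigl(R^{j+1} - Y^{j}\bigr),\\
Y^{j+1} &= \operatorname*{argmin}_{Y}\; g(Y) + \tfrac{\beta}{2}\bigl\|R^{j+1} - Y + Z^{j+\frac12}/\beta\bigr\|^{2},\\
Z^{j+1} &= Z^{j+\frac12} + \gamma\beta\,\bigl(R^{j+1} - Y^{j+1}\bigr),
\end{aligned}
\]

where γ ∈ (0, 1) makes the scheme strictly contractive. In DNN relaxations of this kind, f and g typically encode the semidefinite and the polyhedral (nonnegativity and gangster) constraints, respectively, so that each subproblem reduces to a projection.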
