Abstract

Hypergraph partitioning is an NP-hard problem that occurs in many computer science applications where it is necessary to reduce large problems into a number of smaller, computationally tractable sub-problems. Current techniques use a multilevel approach wherein an initial partitioning is performed after compressing the hypergraph to a predetermined level. This level is typically chosen to produce very coarse hypergraphs in which heuristic algorithms are fast and effective. This article presents a novel memetic algorithm which remains effective on larger initial hypergraphs. This enables the exploitation of information that can be lost during coarsening and results in improved final solution quality. We use this algorithm to present an empirical analysis of the space of possible initial hypergraphs in terms of its searchability at different levels of coarsening. We find that the best results arise at coarsening levels unique to each hypergraph. Based on this, we introduce an adaptive scheme that stops coarsening when the rate of information loss in a hypergraph becomes non-linear and show that this produces further improvements. The results show that we have identified a valuable role for evolutionary algorithms within the current state-of-the-art hypergraph partitioning framework.
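
The multilevel scheme the abstract refers to (coarsen the hypergraph to a chosen level, partition the coarsest hypergraph, then uncoarsen and refine) can be summarised as in the sketch below. This is a minimal illustrative outline under stated assumptions, not the article's or KaHyPar's actual implementation; the callables coarsen_step, initial_partition and project_and_refine are hypothetical placeholders.

```python
def multilevel_partition(hypergraph, k, stop_size,
                         coarsen_step, initial_partition, project_and_refine):
    """Generic multilevel scheme: coarsen until the hypergraph reaches a
    chosen size, partition the coarsest hypergraph into k blocks, then undo
    each coarsening step and locally refine the projected partition.
    All callables here are hypothetical placeholders."""
    history = []
    # Coarsening phase: repeatedly contract vertices, remembering each step.
    while hypergraph.num_vertices() > stop_size:
        hypergraph, contraction = coarsen_step(hypergraph)
        history.append(contraction)
    # Initial partitioning on the coarsest hypergraph -- the phase where the
    # article applies a memetic (evolutionary) algorithm.
    partition = initial_partition(hypergraph, k)
    # Uncoarsening phase: project the partition back level by level and refine.
    for contraction in reversed(history):
        hypergraph, partition = project_and_refine(hypergraph, contraction, partition)
    return partition
```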

Highlights

  • Hypergraph partitioning (HGP) is an NP-hard problem [1] that occurs in many computer science applications where it is necessary to reduce large problems into a number of smaller, computationally tractable subproblems

  • We explore the use of evolutionary algorithms (EAs) to perform the initial partitioning within the state-of-the-art, open-source (GPLv3) Karlsruhe n-level hypergraph partitioning framework, KaHyPar (https://github.com/SebastianSchlag/kahypar)

  • To measure the performance of different algorithms across the full range of thresholds, we present area under the curve (AUC) results, estimated from the experiments at individual thresholds using a composite Simpson's rule (a minimal sketch of such an estimate follows this list)
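
As a minimal sketch of how an AUC estimate can be formed from results recorded at individual thresholds with a composite Simpson's rule, the following may help. The function name and the example thresholds and scores are hypothetical, not values from the article, and the sketch assumes evenly spaced thresholds with an even number of sub-intervals.

```python
import numpy as np

def composite_simpson_auc(thresholds, scores):
    """Estimate the area under a quality-versus-threshold curve with the
    composite Simpson's rule. Assumes evenly spaced thresholds and an even
    number of sub-intervals (i.e. an odd number of sample points)."""
    x = np.asarray(thresholds, dtype=float)
    y = np.asarray(scores, dtype=float)
    n = len(x) - 1                        # number of sub-intervals
    if n < 2 or n % 2 != 0:
        raise ValueError("need an even number of evenly spaced sub-intervals")
    h = (x[-1] - x[0]) / n                # common spacing between thresholds
    return (h / 3.0) * (y[0] + y[-1]
                        + 4.0 * y[1:-1:2].sum()   # odd-indexed interior points
                        + 2.0 * y[2:-1:2].sum())  # even-indexed interior points

# Hypothetical example: a quality score recorded at five coarsening thresholds.
thresholds = [100, 200, 300, 400, 500]
scores = [0.91, 0.93, 0.94, 0.92, 0.90]
print(composite_simpson_auc(thresholds, scores))
```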


Summary

Introduction

Hypergraph partitioning (HGP) is an NP-hard problem [1] that occurs in many computer science applications where it is necessary to reduce large problems into a number of smaller, computationally tractable subproblems. Common applications include very large scale integration (VLSI) design [2] and scientific computing [3]. Hypergraphs are a generalization of graphs in which each hyperedge may connect more than two vertices. A hypergraph can be defined [4], [5] as H = {V, E, c, ω}, where: 1) V = {v1, ..., vn} is a set of n vertices; 2) E is a set of hyperedges, each hyperedge e being a non-empty subset of V; 3) c assigns a positive weight c(v) to each vertex v ∈ V; and 4) ω assigns a positive weight ω(e) to each hyperedge e ∈ E. A hyperedge e ∈ E is said to be incident on a vertex v ∈ V if, and only if, v ∈ e. Two vertices u, v ∈ V are said to be adjacent if they are both contained in a common hyperedge.
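
The following is a small, illustrative Python representation of this definition (ω written as w; the class, field, and method names are my own, not from the article or KaHyPar), showing the incidence and adjacency relations just described.

```python
from dataclasses import dataclass

@dataclass
class Hypergraph:
    """Toy representation of H = {V, E, c, w}: a vertex set, a list of
    hyperedges (each a frozenset of vertices), vertex weights c and
    hyperedge weights w (keyed by hyperedge index)."""
    vertices: set
    hyperedges: list
    c: dict
    w: dict

    def incident(self, e, v):
        """Hyperedge e is incident on vertex v iff v is a member of e."""
        return v in self.hyperedges[e]

    def adjacent(self, u, v):
        """Vertices u and v are adjacent iff some hyperedge contains both."""
        return any(u in e and v in e for e in self.hyperedges)

# A small example: one 3-vertex hyperedge and one 2-vertex hyperedge.
H = Hypergraph(
    vertices={1, 2, 3, 4},
    hyperedges=[frozenset({1, 2, 3}), frozenset({3, 4})],
    c={1: 1, 2: 1, 3: 2, 4: 1},
    w={0: 1, 1: 3},
)
print(H.incident(0, 2), H.adjacent(1, 3), H.adjacent(1, 4))  # True True False
```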

