Abstract

This brief discusses various classes of polynomial optimization models, and our focus is to devise polynomial-time approximation algorithms with worst-case performance guarantees. These classes of problems include many frequently encountered constraint sets in the literature, such as the Euclidean ball, the Euclidean sphere, the binary hypercube, the hypercube, the intersection of co-centered ellipsoids, a general convex compact set, and even a mixture of them. The objective functions range from multilinear tensor functions and homogeneous polynomials to general inhomogeneous polynomials.

Multilinear tensor function optimization plays a key role in the design of the algorithms. For solving multilinear tensor optimization problems, the main construction includes the following inductive components. First, in the low-order cases, such problems are typically either exactly solvable or at least approximately solvable with a guaranteed approximation ratio. Then, for a problem of one degree higher, it is often possible to relax it to a lower-degree polynomial optimization problem, which is solvable by induction. Recovering a solution to the original (higher-degree) problem then involves a carefully devised decomposition step.

Keywords: Polynomial Optimization Models; Binary Hypercube; Worst-case Performance Guarantee; Euclidean Sphere; Approximation Ratio
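As a rough illustration of the inductive scheme outlined above, the following is a minimal LaTeX sketch of the degree-d multilinear tensor model over Euclidean spheres; the notation (tensor F, variable blocks x^1, ..., x^d) and the particular relaxation shown are assumptions for this sketch and may differ from the brief's own formulation.

```latex
% Sketch (assumed notation): degree-d multilinear tensor optimization
% over Euclidean spheres, the core model referred to in the abstract.
\[
  (T_d)\qquad \max_{x^1,\dots,x^d}\;
  F(x^1,\dots,x^d) \;=\; \sum_{i_1,\dots,i_d} F_{i_1 i_2 \cdots i_d}\,
  x^1_{i_1} x^2_{i_2} \cdots x^d_{i_d}
  \quad \text{s.t.}\quad \|x^k\|_2 = 1,\; k = 1,\dots,d.
\]
% Base case d = 2: \max_{\|x\|_2=\|y\|_2=1} x^\top F y equals the largest
% singular value of the matrix F, which is exactly computable.
% Inductive step (one possible relaxation): merge the last two blocks into a
% matrix variable X = x^{d-1}(x^d)^\top with \|X\|_F = 1, solve the resulting
% degree-(d-1) problem by induction, and then decompose the matrix part of
% that solution (e.g. via its leading singular-vector pair) to recover unit
% vectors x^{d-1}, x^d, at the cost of a dimension-dependent factor in the
% approximation ratio.
```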
