Abstract

    min_{p, u1, ..., us}  ‖f1 − p·u1‖² + ‖f2 − p·u2‖² + ··· + ‖fs − p·us‖²        (1)

where p, u1, ..., us are polynomials such that deg(p) ≤ k and deg(p·ui) ≤ di = deg fi for 1 ≤ i ≤ s. Here ‖f‖2 denotes the 2-norm of the coefficient vector of the polynomial f. The minimization problem has many different formulations, and various numeric optimization strategies have been proposed; see [1, 3] and the references therein. In particular, Karmarkar and Lakshman [4] proposed an algorithm based on the global minimization of a rational function to compute the approximate GCD of univariate polynomials. The most expensive part of their algorithm is finding all the real solutions of a system of two bivariate polynomials of high degree. It has been shown in [8] that sum-of-squares (SOS) relaxation [5, 9] can be used to find the global minimum of the rational function arising from the approximate GCD computation. The SOS relaxation can be solved by reformulating it as a semidefinite program (SDP), which in turn is solved efficiently by interior point methods [7]. Motivated by the interesting results obtained in [8], we show how to apply SOS relaxation to solve the different optimization problems formulated in [1, 3, 4]. Let fi, ui, p be the coefficient vectors of the polynomials fi, ui, p, respectively, and let Ai = Ai(p) be the convolution

∗ This research was partially supported by NKBRPC (2004CB318000) and the Chinese National Natural Science Foundation under Grant 10401035 (Li and Zhi).
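To make objective (1) concrete, here is a minimal sketch that evaluates it for given coefficient vectors: the product p·ui is a discrete convolution of coefficient vectors, and the objective sums the squared 2-norms of the residuals fi − p·ui. The helper names `poly_mul` and `gcd_objective` are illustrative, not from the paper, and this only evaluates the objective at a candidate point; it does not perform the minimization.

```python
def poly_mul(p, u):
    """Coefficient vector of the product p(x)*u(x), i.e. the discrete
    convolution of the two coefficient vectors (constant term first)."""
    out = [0.0] * (len(p) + len(u) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(u):
            out[i + j] += a * b
    return out

def gcd_objective(fs, p, us):
    """Value of objective (1): sum_i ||f_i - p*u_i||_2^2."""
    total = 0.0
    for f, u in zip(fs, us):
        prod = poly_mul(p, u)
        # pad the shorter coefficient vector with zeros so lengths match
        n = max(len(f), len(prod))
        f = f + [0.0] * (n - len(f))
        prod = prod + [0.0] * (n - len(prod))
        total += sum((a - b) ** 2 for a, b in zip(f, prod))
    return total

# Example: f1 = (x+1)(x+2) = x^2+3x+2 and f2 = (x+1)(x+3) = x^2+4x+3
# share the exact factor p = x+1, so the residual of (1) is zero there.
f1 = [2.0, 3.0, 1.0]
f2 = [3.0, 4.0, 1.0]
p  = [1.0, 1.0]            # x + 1
u1 = [2.0, 1.0]            # x + 2
u2 = [3.0, 1.0]            # x + 3
print(gcd_objective([f1, f2], p, [u1, u2]))   # 0.0 at the exact factorization
```

At a point where p is only an approximate common divisor, the objective is strictly positive; the approaches surveyed in the abstract differ in how they minimize this quantity over p and the cofactors ui.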
