Abstract

We pose the problem of optimally approximating a given nonnegative signal with the scalar autoconvolution of a nonnegative signal. The I-divergence is chosen as the optimality criterion, being well suited to incorporating nonnegativity constraints. After proving the existence of an optimal approximation, we derive an iterative descent algorithm of the alternating minimization type to find a minimizer. The algorithm is based on the lifting technique developed by Csiszár and Tusnády and exploits the optimality properties of the related minimization problems in the larger space. We study the asymptotic behavior of the iterative algorithm and prove, among other results, that its limit points are Kuhn-Tucker points of the original minimization problem. Numerical experiments confirm the asymptotic results and exhibit the fast convergence of the proposed algorithm.
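The abstract describes the algorithm only at a high level. As a rough illustration of the kind of iteration involved, the sketch below minimizes the I-divergence between a target signal b and the autoconvolution x*x by a generic multiplicative update that preserves nonnegativity. The update rule, the helper names (autoconv, multiplicative_step), and all numerical choices are assumptions made for illustration; this is not the alternating minimization scheme derived in the paper.

```python
import numpy as np

def autoconv(x):
    """Discrete autoconvolution (x * x) of a nonnegative signal x."""
    return np.convolve(x, x)

def i_divergence(b, c, eps=1e-12):
    """I-divergence D(b || c) = sum_k [ b_k log(b_k / c_k) - b_k + c_k ], with 0 log 0 = 0."""
    mask = b > 0
    return np.sum(b[mask] * np.log(b[mask] / (c[mask] + eps))) - b.sum() + c.sum()

def multiplicative_step(x, b, eps=1e-12):
    """One multiplicative update that keeps x nonnegative.

    Illustrative rule (not the paper's derived iteration):
        x_i <- x_i * ( sum_k b_k x_{k-i} / (x*x)_k ) / ( sum_j x_j )
    obtained by splitting the I-divergence gradient into its positive
    and negative parts, in the spirit of EM-type multiplicative schemes.
    """
    c = autoconv(x) + eps                       # current autoconvolution
    ratio = b / c                               # b_k / (x*x)_k, length 2n-1
    num = np.correlate(ratio, x, mode="valid")  # num[i] = sum_k ratio_k * x_{k-i}
    return x * num / (x.sum() + eps)

# Toy usage: approximate b = x_true * x_true starting from a flat initial guess.
rng = np.random.default_rng(0)
x_true = rng.random(16)
b = autoconv(x_true)
x = np.ones_like(x_true)
for _ in range(500):
    x = multiplicative_step(x, b)
print("I-divergence after 500 steps:", i_divergence(b, autoconv(x)))
```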
