Abstract

Multivariate polynomial optimization problems are ubiquitous in signal processing, machine learning, and artificial intelligence. Examples include, but are not limited to, blind source separation, classification, multivariate polynomial regression, and tensor eigenvalue problems. Efficient algorithms for these problems have been studied in the case where the problem can be written as a best low-rank tensor approximation. In the same spirit, we aim to extend these algorithms to a larger class of cost functions by representing multivariate polynomials using compact tensor models. This tensor-based multivariate polynomial optimization framework allows us to tackle a broader range of problems than is possible with existing methods. In this paper, we focus on the symmetric CPD format for representing multivariate polynomials, and show that exploiting this structure results in efficient numerical optimization-based algorithms. We demonstrate our approach on the blind deconvolution of constant modulus signals, outperforming state-of-the-art algorithms in computational time while maintaining similar accuracy.
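As a minimal sketch of the representation the abstract refers to: a degree-$d$ homogeneous polynomial $p(x) = \langle \mathcal{T}, x^{\otimes d}\rangle$ whose symmetric coefficient tensor admits a rank-$R$ symmetric CPD, $\mathcal{T} = \sum_r w_r\, v_r^{\otimes d}$, can be evaluated directly from the factors as $p(x) = \sum_r w_r (v_r^\top x)^d$, at cost $O(Rn)$ per point instead of $O(n^d)$. The function and variable names below are illustrative, not from the paper.

```python
import numpy as np

def eval_poly_cpd(V, w, x, d):
    """Evaluate p(x) = sum_r w[r] * (v_r . x)**d from the symmetric CPD
    factor matrix V (R x n) and weights w (R,), without forming the
    dense coefficient tensor."""
    return np.sum(w * (V @ x) ** d)

rng = np.random.default_rng(0)
n, R, d = 4, 3, 3
V = rng.standard_normal((R, n))   # symmetric CPD factors v_r (rows)
w = rng.standard_normal(R)        # CPD weights
x = rng.standard_normal(n)        # evaluation point

# Sanity check against the explicit dense tensor T = sum_r w[r] * v_r^{(x)3}.
T = sum(w[r] * np.einsum('i,j,k->ijk', V[r], V[r], V[r]) for r in range(R))
p_dense = np.einsum('ijk,i,j,k->', T, x, x, x)
print(np.isclose(eval_poly_cpd(V, w, x, d), p_dense))
```

The same factored form also yields cheap gradients, $\nabla p(x) = d \sum_r w_r (v_r^\top x)^{d-1} v_r$, which is what makes optimization-based algorithms over such polynomials efficient.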
