Abstract

Least squares regression is the simplest and most widely used technique for solving overdetermined systems of linear equations \(Ax = b\), where \(A\in \mathbb R^{n\times p}\) has full column rank and \(b\in \mathbb R^n\). Although there is a well-known unique solution \(x^*\in \mathbb R^p\) that minimizes the squared error \(\Vert Ax - b\Vert _2^2\), the best known classical algorithm to find \(x^*\) takes time \(\Omega (n)\), even for sparse and well-conditioned matrices \(A\), a fairly large class of input instances commonly seen in practice. In this paper, we design an efficient quantum algorithm that generates a quantum state \(| x^* \rangle\) proportional to the solution \(x^*\). The algorithm takes only \(O(\log n)\) time for sparse and well-conditioned \(A\). When the condition number of \(A\) is large, a canonical remedy is regularization. We give efficient quantum algorithms for two regularized regression problems, namely ridge regression and \(\delta \)-truncated SVD, with similar costs and approximation guarantees.
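For reference, the classical closed-form baselines the abstract alludes to (ordinary least squares, ridge regression, and \(\delta\)-truncated SVD) can be written as a minimal NumPy sketch. The function names, the regularization weight lam, and the threshold delta below are illustrative assumptions, not the paper's notation, and this classical route costs at least linear time in \(n\), in contrast to the quantum algorithm described above.

```python
import numpy as np

def least_squares(A, b):
    # Ordinary least squares: x* = argmin_x ||Ax - b||_2^2.
    # For full-column-rank A this equals (A^T A)^{-1} A^T b; lstsq computes it via the SVD.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

def ridge(A, b, lam):
    # Ridge regression: argmin_x ||Ax - b||_2^2 + lam * ||x||_2^2,
    # solved through the regularized normal equations (A^T A + lam I) x = A^T b.
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ b)

def truncated_svd(A, b, delta):
    # delta-truncated SVD: invert only singular values >= delta and drop the rest,
    # which caps the effective condition number of the inverted operator.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[s >= delta] = 1.0 / s[s >= delta]
    return Vt.T @ (s_inv * (U.T @ b))

# Toy usage on a random overdetermined system (n = 100, p = 5).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = rng.standard_normal(100)
x_ls, x_ridge, x_tsvd = least_squares(A, b), ridge(A, b, 0.1), truncated_svd(A, b, 0.5)
```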
