Abstract

Linear regression is a basic and widely used method in data analysis. It is known that some quantum algorithms can efficiently perform least-squares linear regression on an exponentially large data set. However, if we want to obtain the values of the regression coefficients as classical data, the complexity of the existing quantum algorithms can exceed that of the classical method, because it depends strongly on the error tolerance $\epsilon$: the best of the existing proposals scales as $O(\epsilon^{-2})$. In this paper, we propose a new quantum algorithm for linear regression whose complexity is $O(\epsilon^{-1})$ and which keeps the logarithmic dependence on the number of data points $N_D$. In this method, we use quantum amplitude estimation to overcome the bottleneck parts of the calculation, which take the form of sums over data points and therefore have complexity proportional to $N_D$, and we perform the other parts classically. Additionally, we generalize our method to a class of convex optimization problems.
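To make the bottleneck explicit, here is the standard least-squares formulation in illustrative notation (not necessarily the paper's): the regression coefficients minimize a cost that is an average over the $N_D$ data points, and the closed-form solution is built from matrix and vector entries that are themselves such sums,

$$\hat{w} \;=\; \operatorname*{arg\,min}_{w}\, \frac{1}{N_D}\sum_{i=1}^{N_D}\bigl(y_i - x_i^{T}w\bigr)^2 \;=\; (X^{T}X)^{-1}X^{T}y, \qquad (X^{T}X)_{jk} = \sum_{i=1}^{N_D} x_{ij}x_{ik}, \quad (X^{T}y)_{j} = \sum_{i=1}^{N_D} x_{ij}y_i .$$

Each of these entries is the kind of sum over data points that the proposed method estimates with quantum amplitude estimation instead of evaluating term by term.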

Highlights

  • Following the rapid advance of quantum computing technology, many quantum algorithms have been proposed, and their applications to a wide range of practical problems have been studied in recent research

  • We consider a convex optimization problem whose objective function is written as a sum of many terms, as in linear regression, and we present a quantum algorithm for Newton’s method in which the calculation of the gradient and the Hessian is sped up by quantum amplitude estimation (QAE)

  • We propose a quantum algorithm for linear regression, or, more concretely, for estimating the regression coefficients as classical data


Summary

INTRODUCTION

Following the rapid advance of quantum computing technology, many quantum algorithms have been proposed, and their applications to a wide range of practical problems have been studied in recent research. There are several quantum algorithms for linear regression [1,2,3,4,5,6] whose complexity depends on the number of data points $N_D$ as $O(\mathrm{polylog}(N_D))$. This is an exponential speedup over the naive classical method explained in Sec. II B, whose complexity is proportional to $N_D$. We consider a convex optimization problem whose objective function is written as a sum of many terms, as in linear regression, and we present a quantum algorithm for Newton’s method, in which the calculation of the gradient and the Hessian is sped up by QAE.
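As a point of reference, the following minimal classical sketch (with illustrative names X, y, w, not the paper's notation) shows the quantities that the QAE-based method accelerates: each entry of the gradient and the Hessian of the least-squares cost is a sum over the $N_D$ data points, and because this cost is quadratic, a single Newton step already lands on the normal-equation solution. In the quantum algorithm, these sums would instead be estimated by QAE to additive error $\epsilon$ at cost $O(\epsilon^{-1})$.

```python
# Classical reference sketch of the Newton step whose gradient/Hessian entries
# the paper's QAE-based method estimates. Here the sums over the N_D data
# points are computed exactly with NumPy; names are illustrative.
import numpy as np

def newton_step_least_squares(X, y, w):
    """One Newton step for f(w) = (1/N_D) * sum_i (y_i - x_i . w)^2."""
    N_D = X.shape[0]
    residual = X @ w - y                    # shape (N_D,)
    grad = (2.0 / N_D) * (X.T @ residual)   # each entry: a sum over N_D points
    hess = (2.0 / N_D) * (X.T @ X)          # each entry: a sum over N_D points
    return w - np.linalg.solve(hess, grad)  # quadratic cost: one step suffices

# Example usage with synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=1000)
w_hat = newton_step_least_squares(X, y, np.zeros(3))
```

Evaluating each of these sums classically takes time proportional to $N_D$; this is exactly the dependence that the QAE-based estimation reduces to $O(\mathrm{polylog}(N_D))$.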

Notations and definitions
Linear regression
Quantum amplitude estimation
Assumptions
Details of our method
Time complexity
Linear regression as optimization by Newton’s method
Extension of our method
Convergence analysis of the QAE-based Newton’s method
Newton’s method based on classical Monte Carlo
Related previous works
SUMMARY
Proof of Lemma 3
Proof of Lemma 4
Proof of Lemma 5
