Abstract
The chapter starts by reviewing relevant classical machine learning algorithms, categorized as supervised, unsupervised, and reinforcement learning algorithms. The following classical machine learning topics are described: principal component analysis (PCA), support vector machines (SVMs), clustering, boosting, regression analysis, and neural networks. In the PCA section we describe how to determine the principal components from the correlation matrix, followed by a description of singular value decomposition (SVD)-based PCA and the scoring phase. We describe SVMs from both geometric and Lagrangian points of view and introduce hard and soft margins as well as the kernel method. In the clustering section we describe the K-means, expectation-maximization, and K-nearest-neighbor algorithms, and explain how to evaluate clustering quality. In the boosting section we describe how to build strong learners from weak learners, with special attention devoted to the AdaBoost algorithm. In the regression analysis section we describe least-squares estimation, the pseudoinverse approach, and the ridge regression method. In the neural networks section we describe in detail the perceptron, activation functions, and feedforward networks.

The focus then moves to quantum machine learning (QML) algorithms. We first describe the Ising model and relate it to the quadratic unconstrained binary optimization (QUBO) problem. We then study how to solve the QUBO problem by adiabatic quantum computing and quantum annealing. To perform QML on imperfect and noisy quantum circuits, we describe the variational quantum eigensolver and the quantum approximate optimization algorithm (QAOA). To illustrate the impact of QAOA, we describe how to apply it to combinatorial optimization problems, in particular the MAX-CUT problem. Next, quantum boosting is discussed and related to the QUBO problem.
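The MAX-CUT/QUBO correspondence mentioned above can be made concrete on a toy instance. The sketch below is not code from the chapter; the graph, the matrix `Q`, and the helper names are illustrative assumptions. For each edge (i, j) with binary variables x in {0, 1}, the term x_i + x_j - 2 x_i x_j equals 1 exactly when the edge is cut, so maximizing the cut is a QUBO problem; exhaustive search stands in here for an annealer or QAOA.

```python
import itertools

import numpy as np

# A small illustrative graph: a 4-cycle plus one chord
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

def cut_value(x):
    # Number of edges crossing the partition encoded by bits x_i in {0, 1}
    return sum(1 for i, j in edges if x[i] != x[j])

# Equivalent QUBO objective x^T Q x: each edge (i, j) contributes
# x_i + x_j - 2 x_i x_j, which is 1 exactly when the edge is cut
Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] += 1
    Q[j, j] += 1
    Q[i, j] -= 1
    Q[j, i] -= 1

def qubo_value(x):
    x = np.array(x)
    return int(x @ Q @ x)

# Brute force over all 2^n assignments (a classical stand-in for
# adiabatic computing / quantum annealing / QAOA on this toy instance)
best = max(itertools.product([0, 1], repeat=n), key=cut_value)
best_cut = cut_value(best)
```

On this graph the maximum cut has value 4 (e.g., the partition {0, 2} versus {1, 3} cuts all four cycle edges), and `qubo_value` agrees with `cut_value` on every assignment, which is the identity the QUBO reformulation relies on.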
Quantum random access memory is described next; it allows us to address a superposition of memory cells with the help of a quantum register. Quantum matrix inversion, also known as the Harrow–Hassidim–Lloyd (HHL) algorithm, is then described; it serves as a basic ingredient for other QML algorithms, such as quantum PCA, which is described as well. In the quantum optimization-based clustering section we describe how the MAX-CUT problem can be related to clustering, and thus solved by adiabatic quantum computing, quantum annealing, or QAOA. Grover-algorithm-based quantum optimization is discussed next. In the quantum K-means section we describe how to calculate the dot product and the quantum distance, followed by the Grover-search-based K-means algorithm. In the quantum SVM section we formulate the SVM problem in least-squares form and describe how to solve it using quantum matrix inversion. In the quantum neural networks (QNNs) section we describe feedforward QNNs, the quantum perceptron, and quantum convolutional networks.
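The dot-product calculation underlying quantum K-means is commonly done with the swap test. The following statevector sketch is not code from the chapter (the vectors `a` and `b` are arbitrary examples); it simulates the swap-test circuit on two single-qubit data states and checks the standard identity P(ancilla = 0) = (1 + |⟨a|b⟩|²)/2, from which the overlap can be estimated by repeated measurement.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# Two single-qubit data states (arbitrary example vectors)
a = normalize(np.array([1.0, 2.0]))
b = normalize(np.array([3.0, 1.0]))

# Register |+>|a>|b>, with the ancilla as the first (most significant) qubit
plus = np.array([1.0, 1.0]) / np.sqrt(2)
state = np.kron(plus, np.kron(a, b))

# Controlled-SWAP: exchange the two data qubits when the ancilla is |1>,
# i.e. swap basis states |101> and |110>
cswap = np.eye(8)
cswap[[5, 6]] = cswap[[6, 5]]
state = cswap @ state

# Final Hadamard on the ancilla
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
state = np.kron(H, np.eye(4)) @ state

# Probability of measuring the ancilla in |0>: the first 4 amplitudes
p0 = np.sum(np.abs(state[:4]) ** 2)

# Theoretical prediction: P(0) = (1 + |<a|b>|^2) / 2
overlap_sq = abs(np.dot(a, b)) ** 2
```

For unit vectors the squared Euclidean distance follows from the overlap via ‖a − b‖² = 2 − 2⟨a|b⟩, which is what a distance-based clustering step would consume.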
From the book: Quantum Information Processing, Quantum Computing, and Quantum Error Correction