Abstract

Artificial neural networks (ANNs) have proved efficient at solving many problems in big-data analytics using machine learning. An ANN can learn and generalize the complex, non-linear features of its input data. In the big-data era, enormous amounts of data arrive from multiple sources, and a stage is expected where even supercomputers are inundated by the volume. Training an ANN in this setting is challenging because of the size and dimensionality of the data, and a large number of network parameters must be optimized to learn its patterns. Quantum computing is emerging as a field that offers a solution to this problem, since a quantum computer can represent data differently using qubits. Qubits can be used to detect hidden patterns in data that are difficult for a classical computer to find, so there is considerable scope for applying quantum computing to artificial neural networks. In this work, we focus on training an artificial neural network that uses qubits as artificial neurons. Simulation results show that our quantum computing approach to ANNs (QC ANN) is efficient compared with a classical ANN: the model with qubits as artificial neurons can learn the features of the data using fewer parameters for a binary classification task. We demonstrate the experiment on a quantum simulator, while optimization of the quantum parameters used in the QC ANN is carried out on a classical computer.
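The hybrid scheme described above — qubits acting as neurons on a quantum simulator, with the quantum parameters tuned by a classical optimizer — can be illustrated with a minimal sketch. The circuit, data, and training loop below are illustrative assumptions, not the paper's actual model: a single qubit encodes one feature via an RY rotation, a trainable RY angle plays the role of the neuron's weight, and the probability of measuring |1⟩ serves as the classifier output.

```python
import numpy as np

# Hypothetical single-qubit "neuron" (a sketch, not the paper's model):
# state vector |psi> = RY(theta) RY(x) |0>, simulated classically.

def ry(angle):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def predict(x, theta):
    """Probability of measuring |1> after encoding feature x
    and applying the trainable rotation theta."""
    psi = ry(theta) @ ry(x) @ np.array([1.0, 0.0])
    return psi[1] ** 2

def loss(xs, ys, theta):
    """Mean squared error between P(|1>) and the 0/1 labels."""
    return np.mean([(predict(x, theta) - y) ** 2 for x, y in zip(xs, ys)])

# Toy binary-classification data (assumed for illustration):
# features near 0 belong to class 0, features near pi to class 1.
xs = np.array([0.1, 0.2, 2.9, 3.0])
ys = np.array([0, 0, 1, 1])

# Classical optimization of the single quantum parameter theta,
# here plain gradient descent with a finite-difference gradient —
# this is the hybrid quantum-classical loop the abstract refers to.
theta, lr, eps = 0.5, 0.5, 1e-4
for _ in range(200):
    grad = (loss(xs, ys, theta + eps) - loss(xs, ys, theta - eps)) / (2 * eps)
    theta -= lr * grad

preds = [int(predict(x, theta) > 0.5) for x in xs]
```

Note that only the measurement statistics cross the quantum/classical boundary: the simulator evaluates `predict`, while the parameter update runs entirely on the classical side, mirroring the division of labor stated in the abstract.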
