Abstract

In the last few years, quantum computing and machine learning have fostered rapid developments in their respective areas of application, introducing new perspectives on how information processing systems can be realized and programmed. The rapidly growing field of Quantum Machine Learning aims at bringing together these two ongoing revolutions. Here we first review a series of recent works describing the implementation of artificial neurons and feed-forward neural networks on quantum processors. We then present an original realization of efficient individual quantum nodes based on variational unsampling protocols. We investigate different learning strategies involving global and local layer-wise cost functions, and we assess their performance, including in the presence of statistical measurement noise. While keeping full compatibility with the overall memory-efficient feed-forward architecture, our constructions effectively reduce the quantum circuit depth required to determine the activation probability of single neurons upon input of the relevant data-encoding quantum states. This suggests a viable approach towards the use of quantum neural networks for pattern classification on near-term quantum hardware.

Highlights

  • In classical machine learning, artificial neurons and neural networks were originally proposed, more than a half century ago, as trainable algorithms for classification and pattern recognition [1], [2]

  • Among the most relevant results obtained in quantum machine learning, it is worth mentioning the use of trainable parameterized digital and continuous-variable quantum circuits as a model for quantum neural networks [12]–[21], the realization of quantum support vector machines [22] working in quantum-enhanced feature spaces [23], [24], and the introduction of quantum versions of artificial neuron models [25]–[32]

  • In this article, we reviewed an exact model for the implementation of artificial neurons on a quantum processor, and we introduced variational training methods for efficiently handling the manipulation of classical and quantum input data

Summary

INTRODUCTION

Artificial neurons and neural networks were originally proposed, more than half a century ago, as trainable algorithms for classification and pattern recognition [1], [2]. Several attempts have been made to link these powerful but computationally intensive applications to the rapidly growing field of quantum computing; see [8] for a useful review. The latter holds the promise of achieving relevant advantages with respect to classical machines already in the near term, at least on selected tasks including, e.g., chemistry calculations [9], [10], classification, and optimization problems [11]. We review a recently proposed quantum algorithm implementing the activity of binary-valued artificial neurons for classification purposes. Although formally exact, this algorithm in general requires quite large circuit depth for the analysis of the input classical data. By combining memory-efficient encoding schemes and low-depth quantum circuits for the manipulation and analysis of quantum states, the proposed methods, currently at an early stage of investigation, suggest a practical route toward problem-specific instances of quantum computational advantage in machine learning applications.
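The activation rule of such a binary-valued quantum neuron can be illustrated classically. In the memory-efficient encoding mentioned above, an N-bit input pattern and an N-bit weight pattern (entries ±1) are each stored in the 2^N amplitudes of an N-qubit state, and the neuron's firing probability is the squared overlap of the two encoding states. The following numpy sketch (our own illustrative helper, not code from the reviewed work) computes that quantity directly:

```python
import numpy as np

def activation_probability(inputs, weights):
    """Classically simulate the activation probability of a
    binary-valued quantum neuron: the input and weight vectors
    (entries +/-1, length 2^N) are encoded as equal-magnitude
    amplitude states |psi_i> and |psi_w>, and the neuron fires
    with probability |<psi_w|psi_i>|^2."""
    inputs = np.asarray(inputs, dtype=float)
    weights = np.asarray(weights, dtype=float)
    m = inputs.size  # m = 2^N amplitudes for N qubits
    # Normalize both sign patterns into valid quantum states.
    psi_i = inputs / np.sqrt(m)
    psi_w = weights / np.sqrt(m)
    return float(np.dot(psi_w, psi_i) ** 2)

# A pattern matching its own weights fires with certainty;
# an orthogonal pattern never fires.
w = [1, -1, 1, -1]
print(activation_probability(w, w))                           # 1.0
print(activation_probability([1, 1, 1, 1], [1, 1, -1, -1]))   # 0.0
```

On real hardware this overlap is extracted via measurement statistics on an ancilla qubit, which is where the circuit-depth cost of preparing the encoding states, and the variational shortcuts discussed in the following sections, come into play.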

MODEL OF QUANTUM ARTIFICIAL NEURONS
GLOBAL VARIATIONAL TRAINING
LOCAL VARIATIONAL TRAINING
CASE STUDY
CONCLUSION
