Abstract
To enhance the approximation ability of neural networks, a novel quantum-inspired neural network model is proposed in this paper by introducing quantum rotation gates into traditional BP networks. In our model, the hidden layer consists of quantum neurons, each of which carries a group of quantum rotation gates used to update the quantum weights. Both the input and output layers are composed of traditional neurons. The training algorithms are designed by employing the back-propagation algorithm. Simulation-based experiments on two application examples, pattern recognition and function approximation, demonstrate the effectiveness of the proposed model.
Highlights
In the 1980s, Benioff first proposed the concept of quantum computation [1]
The hybrid QINN (HQINN) is an amalgamation of quantum computing and neural computing: it retains the parallelism and high efficiency of quantum computation as well as the continuity, approximation capability, and generalization capability of classical ANNs
In the HQINN, the weights are represented by qubits, and the phase of each qubit is modified by a quantum rotation gate
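The phase-update mechanism described in the highlights can be sketched as follows. This is a minimal illustration, not the paper's exact parameterization: a quantum weight is stored as the phase theta of a single-qubit state cos(theta)|0> + sin(theta)|1>, and applying a rotation gate R(delta) shifts that phase by delta.

```python
import numpy as np

def qubit(theta):
    """Amplitude vector of the state cos(theta)|0> + sin(theta)|1>."""
    return np.array([np.cos(theta), np.sin(theta)])

def rotation_gate(delta):
    """Single-qubit rotation gate R(delta)."""
    return np.array([[np.cos(delta), -np.sin(delta)],
                     [np.sin(delta),  np.cos(delta)]])

# Updating a quantum weight with R(delta) is equivalent to
# shifting its phase from theta to theta + delta.
theta, delta = 0.3, 0.1
rotated = rotation_gate(delta) @ qubit(theta)
assert np.allclose(rotated, qubit(theta + delta))
```

Because the whole state is determined by one angle, training such a weight reduces to choosing the phase increment delta, which is what the back-propagation-based learning rule in the paper computes.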
Summary
In the 1980s, Benioff first proposed the concept of quantum computation [1]. In 1994, Shor discussed the first quantum algorithm for factoring very large integers [2]. [7] proposed a model of quantum neural networks with multi-level hidden neurons based on the superposition of quantum states in quantum theory. [9] proposed a weightless model based on quantum circuits; it is not merely quantum-inspired but an actual quantum NN. This model is based on Grover's search algorithm, and it can both perform quantum learning and simulate the classical models. [11] proposed a quantum BP neural network model with a learning algorithm based on single-qubit rotation gates and two-qubit controlled-NOT gates. We study a new hybrid quantum-inspired neural network model with quantum weights and real weights. Three application examples demonstrate that this quantum-inspired neural network is superior to the classical BP networks
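The hybrid architecture summarized above (a quantum-weight hidden layer feeding a real-weight output layer) can be sketched roughly as below. All names and the aggregation rule are assumptions for illustration only: the effective hidden weight is taken here as cos(theta), one common quantum-inspired choice, and the paper's actual formulation may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 4, 2

# Hidden layer: quantum weights, each parameterized by a phase theta;
# the effective real-valued weight is cos(theta) (illustrative choice).
theta = rng.uniform(0.0, np.pi / 2, size=(n_hidden, n_in))

# Output layer: ordinary real-valued weights, as in a classical BP network.
w_out = rng.standard_normal((n_out, n_hidden))

def forward(x):
    h = sigmoid(np.cos(theta) @ x)   # quantum-neuron hidden layer
    return sigmoid(w_out @ h)        # classical output layer

y = forward(np.array([0.2, 0.5, 0.9]))
```

Under this parameterization, back-propagation would update the phases theta (via rotation gates) in the hidden layer and the real weights w_out in the output layer.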