Abstract

The current generation of quantum computers calls for quantum algorithms that require a limited number of quantum gates and are resilient to noise. A suitable design strategy is variational circuits, whose parameters are determined through training, an approach that conforms with the characteristics of machine learning applications. In this paper, we propose a low-depth, implementation-efficient non-linear activation function for quantum neural networks (QNNs). The building block of a quantum circuit is the quantum gate, which is a unitary operation; building a non-linear component out of quantum gates is therefore challenging. While the majority of prior works used measurement as the source of non-linearity, that method has limited ability to classify datasets. We propose a quantum circuit for the popular Rectified Linear Unit (ReLU) activation function. Our proposed circuit is based on low-cost quantum gates that can be synthesized into the primitive gates of contemporary quantum computers. We exploit QNNs that rely on quantum rotations to define decision boundaries for classification problems. In addition, we use controlled quantum gates to detect correlations in data through entanglement of qubits. Our evaluations reveal that QNNs equipped with our proposed quantum ReLU perform well on standard benchmark datasets while requiring dramatically fewer training epochs than classical neural networks. Finally, we run QNNs with different numbers of quantum layers on an IBM quantum computer and show that our proposed circuits are practical and generate meaningful results on real quantum hardware.
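The abstract notes that quantum gates are unitary and that rotation-based QNNs use measurement probabilities to define decision boundaries. As a minimal NumPy sketch of that general idea (not the authors' proposed circuit, and with an arbitrary 0.5 threshold chosen only for illustration), a single RY rotation applied to |0⟩ yields a measurement probability that varies smoothly with the rotation angle:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis: a 2x2 unitary gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def p_one(theta):
    """Probability of measuring |1> after applying RY(theta) to |0>."""
    state = ry(theta) @ np.array([1.0, 0.0])  # RY(theta)|0>
    return abs(state[1]) ** 2                 # = sin^2(theta/2)

def classify(theta):
    """Toy decision rule: the 0.5 probability level acts as a boundary.

    In a variational QNN, theta would be a trained parameter (or an
    encoded input feature); this hard threshold is an assumption made
    purely for illustration.
    """
    return int(p_one(theta) > 0.5)
```

Here `p_one(0.0)` is 0 and `p_one(np.pi)` is 1, so the classifier output flips as the rotation angle crosses π/2, which is the sense in which a rotation angle can carve out a decision boundary.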
