Abstract

Today's quantum processors, composed of fifty or more qubits, have brought us into a computational era in which the output results are no longer easily simulable on the world's largest supercomputers. What remains to be seen, however, is whether the complexity achievable with such quantum processors can be harnessed for any practical application. In the quantum-neural-network approach, the Hilbert space has been expected to serve as a computational resource by providing a large feature space; however, it has been neither understood nor demonstrated in what way the Hilbert space provides that computational power. In this work we introduce a resource-efficient quantum neural network model that focuses on the role of the feature space a quantum processor can provide. We then apply it to classification tasks, showing a quantum advantage in terms of the physical resources needed to realize the neural network. To exploit the Hilbert space as a large feature space, we utilize scale-free networks that can be generated in a discrete-time-crystal model. The virtue of this approach lies in both its theoretical simplicity and its practical applicability: our quantum neural network model is simple enough to identify the role of the Hilbert space as the feature space and, furthermore, can be implemented with current technology. Our model does not require optimization of the quantum processor; hence, once the processor has been set up, it can be reused for different classification problems without any change to the processor itself.
