Abstract

Prototype-based models like Generalized Learning Vector Quantization (GLVQ) belong to the class of interpretable classifiers. Moreover, quantum-inspired methods are coming more and more into focus in machine learning due to their potential for efficient computing. Further, their interesting mathematical perspectives offer new ideas for alternative learning scenarios. This paper proposes a quantum-computing-inspired variant of the prototype-based GLVQ for classification learning. We start by considering kernelized GLVQ with real- and complex-valued kernels and their respective feature mappings. Thereafter, we explain how quantum-space ideas can be integrated into GLVQ using quantum bit vectors in the quantum state space ℋⁿ and show the relations to kernelized GLVQ. In particular, we explain the related feature mapping of data into the quantum state space ℋⁿ. A key feature of this approach is that ℋⁿ is a Hilbert space with particular inner-product properties, which finally restrict the prototype adaptations to be unitary transformations. The resulting approach is denoted as Qu-GLVQ. We provide the mathematical framework and give exemplary numerical results.
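The feature mapping of data into a quantum state space can be illustrated with a minimal sketch. We assume an angle-based encoding of features scaled to [0, 1] into single-qubit amplitude pairs; the function names and the exact map are our illustrative assumptions, not necessarily the paper's construction:

```python
import numpy as np

def qubit_feature_map(x):
    """Illustrative qubit encoding (an assumption for demonstration):
    map each feature x_i in [0, 1] to the amplitude pair
    (cos t_i, sin t_i) with t_i = (pi/2) * x_i.
    Every pair has unit norm, so the image lies in the quantum state space."""
    t = 0.5 * np.pi * np.asarray(x, dtype=float)
    return np.stack([np.cos(t), np.sin(t)], axis=-1)

def state_overlap(a, b):
    """Similarity of two encoded data points as the sum of the
    componentwise qubit inner products (real-valued case)."""
    return float(np.sum(qubit_feature_map(a) * qubit_feature_map(b)))
```

Because each amplitude pair is normalized by construction, prototype updates that preserve this norm correspond to unitary (here: rotation) transformations, which is the restriction mentioned above.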

Highlights

  • Classification learning remains one of the main tasks in machine learning [5]

  • We present the basic mathematical properties of quantum state spaces and incorporate these concepts into Generalized Learning Vector Quantization (GLVQ). We show that this approach is mathematically consistent with kernel GLVQ

  • The complex variant of Qu-GLVQ depends on both x⃗ and w⃗ according to (26), and the angle-vector update (27) for x⃗ is accompanied by the respective prototype update based on ∂L(|x⟩, W)/∂w⃗


Introduction

Classification learning remains one of the main tasks in machine learning [5]. Although powerful methods are available, there is still a need for improvements and for alternatives to existing strategies. The possibly improved performance of kernel GLVQ (KGLVQ) comes with weaker interpretability, because the implicit kernel mapping does not allow the mapped data to be observed directly in the feature space. Another way to accelerate the usually time-consuming training process of machine learning models is to make use of efficient quantum computing algorithms [9, 12, 33]. The c-means algorithm is considered in [22, 69], which can be seen in connection with quantum classification algorithms based on competitive learning [71]. In this contribution, we propose an alternative nonlinear data processing, somewhat related to kernel GLVQ, keeping the idea of mapping the data nonlinearly into a particular Hilbert space. One goal is to show that both the quantum and the kernel approach are mathematically more or less equivalent, while kernel approaches apply the mapping implicitly and the quantum approach does the mapping explicitly.
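For orientation, the classical GLVQ learning step that Qu-GLVQ builds on can be sketched as follows. This is a minimal Euclidean sketch assuming the standard GLVQ classifier function μ = (d⁺ − d⁻)/(d⁺ + d⁻); the function and variable names are ours, not from the paper's implementation:

```python
import numpy as np

def glvq_update(x, y, prototypes, labels, lr=0.05):
    """One stochastic GLVQ step: attract the best-matching correct
    prototype, repel the best-matching incorrect one, weighted by the
    gradient of mu = (d_plus - d_minus) / (d_plus + d_minus)."""
    d = np.sum((prototypes - x) ** 2, axis=1)  # squared Euclidean distances
    correct = labels == y
    j_plus = np.where(correct)[0][np.argmin(d[correct])]      # closest correct
    j_minus = np.where(~correct)[0][np.argmin(d[~correct])]   # closest incorrect
    dp, dm = d[j_plus], d[j_minus]
    denom = (dp + dm) ** 2
    # gradient descent on mu moves w_plus toward x and w_minus away from x
    prototypes[j_plus] += lr * (2.0 * dm / denom) * (x - prototypes[j_plus])
    prototypes[j_minus] -= lr * (2.0 * dp / denom) * (x - prototypes[j_minus])
    return prototypes
```

In Qu-GLVQ, the distances are replaced by dissimilarities in the quantum state space, and the prototype updates are constrained to norm-preserving (unitary) transformations.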

GLVQ for real-valued data
Complex variants of GLVQ
Quantum-inspired GLVQ
The real case
The complex case
Numerical experiments
Conclusion
Compliance with ethical standards