Abstract
Introduction: This study investigates the integration of quantum-inspired learning models with traditional Hebbian learning within neural networks, comparing their performance in learning efficiency, generalization, stability, and robustness. Traditional Hebbian models are biologically plausible but often struggle with stability, scalability, and adaptability. In contrast, quantum-inspired models leverage principles of quantum mechanics, such as superposition and entanglement, to potentially enhance neural network performance.

Methods: The simulations used a neural network of 1,000 neurons and 100 patterns across 10 instances. Key parameters included a fixed decay rate of 0.005, 80% excitatory neurons, and 10% fixed connectivity. Learning rates (0.01, 0.05, 0.1) and thresholds (0.3, 0.5, 0.7) were varied to assess different parameter settings. Performance was evaluated using accuracy, precision, recall, and F1-score.

Results: Quantum-inspired models achieved significantly higher accuracy and precision, making their class predictions more reliable and reducing false positives. Hebbian models, by contrast, excelled in recall and F1-score, identifying positive cases effectively and balancing precision and recall. Quantum-inspired models also demonstrated greater stability, robustness, and more consistent performance across varying parameters.

Conclusion: Quantum-inspired models offer notable improvements in learning efficiency, generalization, stability, and robustness, while Hebbian models perform better in recall and F1-score. These findings suggest the potential for hybrid models that combine the strengths of both approaches, aiming for more balanced and efficient learning systems. Future research should explore such hybrid models to improve performance across diverse artificial intelligence applications. Supplementary materials include the complete R code used, enabling replication and further investigation of the results.
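The Methods description above corresponds to a small set of simulation parameters and four standard classification metrics. The R sketch below restates them for orientation; all object and function names are illustrative assumptions, and the authors' complete implementation is provided in the supplementary materials.

# Minimal sketch of the simulated network setup described in Methods.
# All names here are assumptions; the full R code is in the supplement.
set.seed(1)
n_neurons    <- 1000                   # network size
n_patterns   <- 100                    # patterns per instance
n_instances  <- 10                     # independent simulation runs
decay_rate   <- 0.005                  # fixed synaptic decay
p_excitatory <- 0.80                   # proportion of excitatory neurons
connectivity <- 0.10                   # fixed connection probability
learning_rates <- c(0.01, 0.05, 0.1)   # varied across settings
thresholds     <- c(0.3, 0.5, 0.7)     # varied across settings

# Sparse random connectivity mask: 10% of possible synapses are present.
mask <- matrix(rbinom(n_neurons^2, 1, connectivity),
               nrow = n_neurons, ncol = n_neurons)

# Each presynaptic neuron is excitatory (+1, 80%) or inhibitory (-1, 20%).
neuron_sign <- ifelse(runif(n_neurons) < p_excitatory, 1, -1)

# The four reported metrics, computed from binary predictions and truth.
classification_metrics <- function(pred, truth) {
  tp <- sum(pred == 1 & truth == 1)
  fp <- sum(pred == 1 & truth == 0)
  fn <- sum(pred == 0 & truth == 1)
  tn <- sum(pred == 0 & truth == 0)
  precision <- tp / (tp + fp)
  recall    <- tp / (tp + fn)
  c(accuracy  = (tp + tn) / length(truth),
    precision = precision,
    recall    = recall,
    f1        = 2 * precision * recall / (precision + recall))
}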
Highlights
This study investigates the integration of quantum-inspired learning models with traditional Hebbian learning within neural networks, comparing their performance in learning efficiency, generalization, stability, and robustness.
The study of neural networks has profoundly advanced the field of artificial intelligence, driven by a fundamental question: how can artificial systems emulate the processes of learning and memory seen in biological organisms? Central to this inquiry is the development of learning algorithms that enable networks to adapt, generalize, and execute complex tasks efficiently and precisely [1-5].
The quantum-inspired learning framework proposed in this study addresses the limitations of Hebbian learning by leveraging core principles of quantum mechanics [29].
Summary
This study investigates the integration of quantum-inspired learning models with traditional Hebbian learning within neural networks, comparing their performance in learning efficiency, generalization, stability, and robustness. As computational demands have grown, the limitations of Hebbian learning, particularly its scalability and adaptability, have become increasingly apparent [9]. These constraints have inspired the search for innovative approaches, including the integration of principles from quantum mechanics to redefine how neural networks learn and process information. Unlike conventional learning models, which rely on deterministic updates to synaptic weights, this framework incorporates probabilistic mechanisms that enable neural networks to explore multiple solutions simultaneously [15, 16]. This innovation enhances the efficiency and robustness of learning, addressing many of the challenges inherent in conventional methods while expanding the potential for future advancements in neural computation.
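To make the summary's contrast concrete: a deterministic Hebbian rule changes each weight by a fixed function of pre- and post-synaptic activity, while a probabilistic rule applies each candidate change only with some probability, so repeated runs can explore different weight configurations. The R sketch below illustrates this distinction using assumed function names and a simple Bernoulli gating scheme; it is not the paper's actual update rule, which appears in the supplementary code.

# Deterministic Hebbian update: the change to w is fully determined by
# pre-synaptic activity x, post-synaptic activity y, and a decay term.
hebbian_update <- function(w, x, y, eta = 0.05, decay = 0.005) {
  w + eta * outer(y, x) - decay * w
}

# One possible probabilistic variant (an illustrative assumption): each
# candidate weight change is kept only with probability p, so identical
# inputs can lead different runs toward different solutions.
stochastic_update <- function(w, x, y, eta = 0.05, decay = 0.005, p = 0.5) {
  dw   <- eta * outer(y, x) - decay * w           # same candidate change
  keep <- matrix(rbinom(length(dw), 1, p),        # Bernoulli gate per synapse
                 nrow = nrow(w))
  w + keep * dw
}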