Abstract

In the training of neural networks (NNs), the choice of hyper-parameters is crucial and largely determines the final quality of the trained model. Among them, learning rate decay (LRD) can improve learning speed and accuracy, while weight decay (WD) mitigates over-fitting to varying degrees. However, these decay methods still suffer from problems such as hysteresis and rigidity in parameter adjustment, which can leave the final model inferior. Based on quantum contextuality (QC) theory, we propose a quantum contextuality constraint (QCC) that constrains the weights of nodes in NNs to further improve training. On a simple classification model, we combine this constraint with different LRD and WD methods and verify that QCC further improves upon the decay methods alone. Experimental results show that QCC significantly improves the convergence and accuracy of the model.
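The two decay mechanisms the abstract refers to can be illustrated with a minimal sketch (not the paper's implementation, and without the proposed QCC): plain gradient descent on a logistic-regression classifier, with an exponential learning-rate schedule and an L2 weight-decay penalty. The data, decay factor, and coefficients below are illustrative assumptions.

```python
import numpy as np

# Synthetic, linearly separable classification data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w > 0).astype(float)

def loss(w, lam):
    """Cross-entropy plus the L2 penalty that weight decay corresponds to."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    ce = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return ce + 0.5 * lam * np.sum(w ** 2)

w = np.zeros(5)
lr0, gamma, lam = 0.5, 0.99, 1e-3  # base LR, LRD factor, WD strength (assumed values)
for step in range(300):
    lr = lr0 * gamma ** step                 # learning rate decay (LRD)
    p = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ (p - y) / len(y) + lam * w  # gradient including weight decay (WD)
    w -= lr * grad

initial, final = loss(np.zeros(5), lam), loss(w, lam)
print(final < initial)  # training reduced the regularized loss
```

The decay factor `gamma` shrinks the step size each iteration (addressing convergence speed), while the `lam * w` term pulls weights toward zero (addressing over-fitting); the paper's QCC would act as an additional constraint on top of such a setup.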
