Abstract

A method that uses the real-coded quantum-inspired genetic algorithm (RQGA) to optimize the weights and thresholds of a BP neural network is proposed to overcome the defect of the gradient descent method, which makes the algorithm easily fall into a local optimum during learning. The quantum genetic algorithm (QGA) has good directed global optimization ability, but the conventional QGA is based on binary coding, so the encoding and decoding processes slow down the computation. RQGA is therefore introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm converges rapidly to a solution that satisfies the constraint conditions.
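The abstract refers to an improved variable learning rate for training the BP network but does not state the rule here. Below is a minimal Python sketch, assuming a common heuristic (raise the rate while the training error keeps falling, lower it when the error rises) applied to a plain gradient-descent step for a single-hidden-layer network; the function names, the up/down factors, and the network shape are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def adjust_learning_rate(lr, err, prev_err, up=1.05, down=0.7):
    # Hypothetical variable-learning-rate rule: grow the rate while the
    # error keeps decreasing, shrink it when the error increases.
    return lr * up if err < prev_err else lr * down

def bp_step(W1, b1, W2, b2, X, y, lr):
    # One gradient-descent update for a single-hidden-layer BP network
    # with tanh hidden units, a linear output layer, and MSE loss.
    h = np.tanh(X @ W1 + b1)             # hidden activations
    out = h @ W2 + b2                    # network output
    err = np.mean((out - y) ** 2)        # mean squared error
    g_out = 2 * (out - y) / len(X)       # dE/d(out)
    gW2, gb2 = h.T @ g_out, g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1 - h ** 2)  # back-propagate through tanh
    gW1, gb1 = X.T @ g_h, g_h.sum(axis=0)
    return (W1 - lr * gW1, b1 - lr * gb1, W2 - lr * gW2, b2 - lr * gb2), err
```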

Highlights

  • Artificial neural networks (ANNs) are put forward to solve nonlinear problems by simulating the operational process of the nervous system

  • Sun et al. establish a prediction model based on an improved BP neural network and adopt it to investigate quantitative evolution laws of equiaxed α in near-β forging of TA15 Ti-alloy [4]

  • Xiao et al. propose a back propagation neural network combined with rough set for complicated short-term load forecasting with dynamic and nonlinear factors, to improve the accuracy of predictions [5]

Summary

Introduction

Artificial neural networks (ANNs) are put forward to solve nonlinear problems by simulating the operational process of the nervous system. Experiential or statistical methods are applied to determine the structure of the BPNN, which is the optimal combination of the number of hidden layers, the number of hidden neurons, the choice of input factors, and the parameters of the training algorithm [8]. Network growth/removal algorithms add or remove neurons from the initial structure according to a predetermined criterion that reflects the effect of the change on the performance of the ANN. The basic rule is to add neurons when the training process is slow or the mean square deviation is greater than the specified value. The growth/removal algorithm relies on the basic gradient descent method, which cannot guarantee convergence to the global minimum; the algorithm may fall into a local optimum near the initial point. Introducing RQGA to optimize the weights and thresholds of the BPNN makes it possible to obtain a better solution with higher probability, as illustrated in the sketch below.
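As a concrete illustration of the idea in this introduction, the sketch below shows one plausible way a real-coded quantum-inspired GA can search for a BP network's weight/threshold vector before gradient-based fine-tuning. The representation (one rotation angle per gene, observed as a real value via cos²) and the rotation-style update toward the best individual are common RQGA ingredients, but the operator details, population size, and step `delta` here are assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def rqga_optimize(fitness, dim, lo, hi, pop=20, gens=100, delta=0.05 * np.pi):
    # Minimal real-coded quantum-inspired GA sketch (illustrative only).
    # Each gene is a rotation angle theta in [0, pi/2]; observing a gene
    # maps cos(theta)**2 into the real search interval [lo, hi].
    theta = np.random.uniform(0.0, np.pi / 2, size=(pop, dim))
    observe = lambda t: lo + (hi - lo) * np.cos(t) ** 2
    best_x, best_f = None, np.inf
    for _ in range(gens):
        X = observe(theta)                       # collapse to real solutions
        f = np.array([fitness(x) for x in X])
        i = int(f.argmin())
        if f[i] < best_f:
            best_f, best_x = f[i], X[i].copy()
        # Rotation-gate style update: nudge each individual's angles toward
        # the angles that would reproduce the best solution found so far.
        target = np.arccos(np.sqrt((best_x - lo) / (hi - lo)))
        theta = np.clip(theta + delta * np.sign(target - theta), 0.0, np.pi / 2)
    return best_x, best_f
```

In the BPNN setting, `fitness` would typically evaluate the training error of a network whose weights and thresholds are unpacked from the candidate vector, and the returned `best_x` would seed the network before BP fine-tunes it with the variable learning rate.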

BP Neural Network
Real-Coded Quantum-Inspired Genetic Algorithm
The BPNN Based on RQGA
Case Analysis
Findings
Conclusions