Abstract

In order to accelerate the learning of neural network structure parameters and improve the prediction accuracy of deep learning algorithms, an evolutionary algorithm based on a prior Gaussian mutation (PGM) operator is proposed to optimize the structure parameters of a gated recurrent unit (GRU) neural network. In this algorithm, the sensitivity learning process of the GRU model parameters is incorporated into the Gaussian mutation operator, and the variance of the GRU model parameter training results is used as the Gaussian mutation variance to generate the candidate set of optimal individuals. The optimal GRU neural network structure is then constructed using the evolutionary algorithm with the prior Gaussian mutation operator. The resulting PGM-EA-GRU algorithm is applied to the prediction of stock market returns. Experiments show that the prediction model effectively overcomes the GRU neural network's tendency to fall into local optima and to converge slowly. Compared to the RF, SVR, RNN, LSTM, GRU, and EA-GRU benchmark models, the model significantly improves the search for optimal network structure parameters and the prediction accuracy, validating the effectiveness and advancement of the proposed PGM-EA-GRU model for stock market return prediction.
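
As a rough illustration of the operator described above (not the authors' implementation), the sketch below shows how a training-derived parameter variance might replace a fixed mutation variance in a Gaussian mutation step. The function names, the real-valued encoding of the GRU structure parameters, and the placeholder fitness function are all assumptions.

```python
import numpy as np

def prior_gaussian_mutation(parent, prior_variance, n_candidates, rng):
    """Mutate one individual's structure parameters with a prior Gaussian operator.

    `prior_variance` is assumed to hold the per-parameter variance observed while
    training the GRU model; it takes the place of a fixed mutation variance.
    """
    std = np.sqrt(prior_variance)
    # Each candidate is the parent perturbed by zero-mean Gaussian noise whose
    # scale comes from the prior (training-derived) variance.
    return parent + rng.normal(0.0, std, size=(n_candidates, parent.size))

def select_best(candidates, fitness_fn):
    """Return the candidate with the highest fitness (e.g., negative validation error)."""
    scores = np.array([fitness_fn(c) for c in candidates])
    return candidates[np.argmax(scores)]

# Hypothetical usage: `parent` encodes GRU structure parameters (e.g., hidden size,
# layer count, learning rate) as a real-valued vector.
rng = np.random.default_rng(0)
parent = np.array([64.0, 2.0, 1e-3])
prior_variance = np.array([25.0, 0.25, 1e-8])  # assumed to come from training statistics
candidates = prior_gaussian_mutation(parent, prior_variance, n_candidates=20, rng=rng)
best = select_best(candidates, fitness_fn=lambda c: -np.sum((c - parent) ** 2))  # placeholder fitness
```

In the full algorithm as summarized in the abstract, selection of this kind would be repeated over generations of the evolutionary algorithm to arrive at the optimal GRU structure.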
