Abstract

General users without prior statistical knowledge often have difficulty applying nonlinear statistical time series methods (e.g., the bilinear model, the threshold autoregressive model, the smooth transition autoregressive model, and the autoregressive conditional heteroscedastic (ARCH) and generalized ARCH (GARCH) models) and spectral analysis approaches. Moreover, standard back-propagation neural networks (BPNNs) trained with the steepest descent algorithm have the disadvantage of converging to local optima. To overcome these limitations, this work develops a self-organizing polynomial neural network (SOPNN) based on the group method of data handling (GMDH) algorithm, a statistical learning network. The proposed SOPNN scheme constructs an optimal network topology through an evolutionary process and is easy to use. Its performance is evaluated on the Mackey-Glass chaotic time series prediction problem. Numerical results of the proposed SOPNN are compared with those of a conventional BPNN, a cerebellar model articulation controller NN (CMAC NN), an advanced simulated annealing-based BPNN (ASA-BPNN) and an artificial immune algorithm-based BPNN (AIA-BPNN). Experimental results indicate that the training and testing performance of the proposed SOPNN is superior to that of a conventional BPNN. Furthermore, the generalization capability and computational CPU time of the proposed SOPNN are competitive with those of the CMAC NN, AIA-BPNN and ASA-BPNN.
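To make the two key ingredients of the abstract concrete, the sketch below is an illustrative (not the authors') Python example: it integrates the Mackey-Glass delay differential equation with a simple Euler scheme and builds one GMDH-style layer of quadratic "partial descriptions", keeping the units with the lowest validation error. All parameter values, lag choices and function names are assumptions for illustration only.

```python
# Minimal sketch, assuming a simple Euler-integrated Mackey-Glass series and a
# single GMDH-style selection layer; not the paper's actual SOPNN implementation.
import numpy as np

def mackey_glass(n=1500, tau=17, beta=0.2, gamma=0.1, p=10, dt=1.0, x0=1.2):
    """Integrate dx/dt = beta*x(t-tau)/(1 + x(t-tau)^p) - gamma*x(t) by Euler steps."""
    hist = int(tau / dt)
    x = np.full(n + hist, x0)
    for t in range(hist, n + hist - 1):
        x_tau = x[t - hist]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** p) - gamma * x[t])
    return x[hist:]

def gmdh_layer(X_train, y_train, X_val, y_val, keep=4):
    """Fit a quadratic polynomial for every input pair; keep the best units by validation RMSE."""
    results = []
    for i in range(X_train.shape[1]):
        for j in range(i + 1, X_train.shape[1]):
            def design(X):
                a, b = X[:, i], X[:, j]
                return np.column_stack([np.ones_like(a), a, b, a * b, a ** 2, b ** 2])
            coef, *_ = np.linalg.lstsq(design(X_train), y_train, rcond=None)
            rmse = np.sqrt(np.mean((design(X_val) @ coef - y_val) ** 2))
            results.append((rmse, i, j, coef))
    results.sort(key=lambda r: r[0])
    return results[:keep]  # surviving units would feed the next layer in a full SOPNN

# Illustrative task: predict x(t+6) from lagged samples x(t), x(t-6), x(t-12), x(t-18).
series = mackey_glass()
lags = [18, 12, 6, 0]
T = np.arange(18, len(series) - 6)
X = np.column_stack([series[T - l] for l in lags])
y = series[T + 6]
split = len(y) // 2
best = gmdh_layer(X[:split], y[:split], X[split:], y[split:])
print("best validation RMSE:", best[0][0])
```

In a full SOPNN, this pair-wise fitting and selection step would be repeated layer by layer, with training stopped once the best validation error no longer improves, which is how the evolutionary topology construction mentioned in the abstract avoids the local-optimum problem of gradient-trained BPNNs.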
