Abstract

Feedforward neural networks (FNNs) with a single hidden layer have been widely applied in data modeling because of their universal approximation capability for nonlinear maps. However, this theoretical result offers no practical guideline for determining the architecture of the model. Research on the self-organization of FNNs is therefore useful and critical for effective data modeling. This paper proposes a hybrid constructing and pruning strategy (HCPS), in which mutual information (MI) and sensitivity analysis (SA) are employed to measure the information shared between neurons in the hidden layer and the contribution rate of each hidden neuron, respectively. HCPS merges hidden neurons when their MI value becomes too high, deletes hidden neurons when their contribution rates are sufficiently small, and splits hidden neurons when their contribution rates are very large. For each input pattern fed into the model as a training sample, the weights of the neural network are updated so that the model's output remains unchanged during structural adjustment. HCPS aims to obtain a compact model by eliminating redundant neurons without degrading the instantaneous modeling performance, which is closely related to the model's generalization ability. The proposed algorithm is evaluated on several benchmark data sets, including classification problems, a nonlinear system identification problem, a time-series prediction problem, and a real-world application to PM2.5 prediction. Simulation results with comparisons demonstrate that the proposed method performs favorably and improves upon existing work in terms of modeling performance.
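The abstract describes three structural operations (merge on high MI, delete on low contribution, split on high contribution) together with an output-preserving weight update. The sketch below is a minimal illustration of that decision logic, not the authors' implementation: the function name `hcps_adjust`, the threshold values, and the assumption of a linear output layer are all placeholders introduced here for clarity.

```python
import numpy as np

def hcps_adjust(W_in, W_out, mi, contrib,
                mi_high=0.9, contrib_low=0.01, contrib_high=0.5):
    """One structural-adjustment pass in the spirit of HCPS (illustrative only).

    W_in    : (n_hidden, n_in)      input-to-hidden weights
    W_out   : (n_hidden,)           hidden-to-output weights (linear output assumed)
    mi      : (n_hidden, n_hidden)  pairwise mutual information of hidden outputs
    contrib : (n_hidden,)           contribution rate of each hidden neuron (from SA)
    All thresholds are illustrative placeholders, not values from the paper.
    """
    n = len(W_out)
    keep = np.ones(n, dtype=bool)

    # Merge: two hidden neurons with very high MI are redundant; absorb one into
    # the other by summing their output weights, which keeps the network output
    # (approximately) unchanged when their activations are nearly identical.
    for i in range(n):
        for j in range(i + 1, n):
            if keep[i] and keep[j] and mi[i, j] > mi_high:
                W_out[i] += W_out[j]
                keep[j] = False

    # Delete: prune neurons whose contribution rate is negligible.
    keep &= contrib > contrib_low
    W_in, W_out, contrib = W_in[keep], W_out[keep], contrib[keep]

    # Split: a neuron with a very large contribution rate is duplicated; halving
    # the output weight of both copies preserves the current output exactly
    # (a small perturbation of the copy's input weights could break symmetry,
    # at the cost of a slight output change).
    for i in np.where(contrib > contrib_high)[0]:
        W_in = np.vstack([W_in, W_in[i]])
        W_out = np.append(W_out, W_out[i] / 2.0)
        W_out[i] /= 2.0

    return W_in, W_out
```

In this reading, the output-preservation property follows from simple weight compensation: merged neurons pool their output weights, and split neurons share the original output weight equally, so the instantaneous model output is unaffected by the structural change.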
