Abstract

Function localization neural networks (FLNNs) are neural networks that possess not only the capability of learning but also the capability of function localization. Function localization improves the efficiency of individual neurons and gives FLNNs better representation ability. However, a conventional backpropagation (BP) algorithm for FLNN training easily gets stuck at local minima. A likely reason is that the error function used for training becomes complicated because the overlapping modules are switched according to input patterns. Statistical analysis of numerical simulation results shows a strong relation between the local-minimum problem and the variance of the errors calculated for different modules. Based on this result, this paper proposes an evaluation function that combines the ordinary sum of squared errors (SSE) with the variance of the module SSEs, and applies it to BP training. In this way, BP training tries to reduce both the error of the FLNN and the variance of the module errors, so as to avoid getting stuck at a local minimum. Numerical simulations demonstrate the effectiveness of the proposed evaluation function.
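The combined evaluation function described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weighting coefficient `lam` and the function name `combined_evaluation` are assumptions, since the abstract does not specify how the SSE and variance terms are weighted.

```python
import numpy as np

def combined_evaluation(module_errors, lam=0.5):
    """Combined evaluation for FLNN training: total SSE plus a
    weighted variance of per-module SSEs.

    module_errors: list of arrays, each holding the squared output
    errors for the patterns handled by one module.
    lam: hypothetical weighting coefficient (the abstract does not
    give the actual combination scheme).
    """
    # Sum of squared errors computed separately for each module
    module_sse = np.array([e.sum() for e in module_errors])
    # Ordinary SSE over all patterns and modules
    total_sse = module_sse.sum()
    # Variance of the per-module SSEs; reducing it discourages the
    # training from letting one module's error dominate the others
    var_sse = module_sse.var()
    return total_sse + lam * var_sse
```

Minimizing this quantity with gradient descent pushes down the overall error while simultaneously equalizing the module errors, which is the mechanism the abstract credits with avoiding local minima.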
