Abstract

This article considers two issues in dynamically generated hierarchical neural networks: the choice of the basic neuron and how to compose a layer. For the first issue, a variant of least-squares support vector regression (SVR) is chosen as the basic neuron. The support vector machine (SVM) is a representative classifier that usually shows good classification performance, and SVR was introduced alongside SVMs to handle regression problems. In particular, least-squares SVR offers high learning speed because the inequality constraints in the formulation of the optimization problem are replaced by equality constraints. Building on least-squares SVR, the multiple least-squares (MLS) SVR, a linear combination of least-squares SVRs guided by fuzzy clustering, is proposed to improve modeling performance. A hierarchical neural network is then developed in which the MLS SVR serves as the generic node in place of the conventional polynomial. For hierarchical networks generated dynamically layer by layer, a key issue is how to retain the diversity of the nodes within each layer as the number of layers grows. To maintain this diversity, selection methods such as truncation selection and roulette wheel selection (RWS) are proposed for choosing nodes among the candidate nodes. Furthermore, to reduce the computational overhead of evaluating all candidates formed from every combination of the input variables, a new implementation method is proposed. In terms of both the diversity of the selected nodes and computational cost, the proposed method is shown to be preferable to the conventional design methodology.
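The speed advantage of least-squares SVR comes from the equality constraints: training collapses to solving one linear system rather than a quadratic program. The following is a minimal sketch of that standard formulation, not the authors' exact MLS variant; the RBF kernel, and the `gamma` (regularization) and `sigma` (kernel width) parameters, are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, sigma):
    """Gaussian (RBF) kernel matrix between row-sample sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=100.0, sigma=1.0):
    """Fit least-squares SVR by solving a single linear system.

    Because equality constraints replace the inequality constraints
    of standard SVR, the KKT conditions reduce to:
        [[0, 1^T       ],   [[b    ],    [[0],
         [1, K + I/gamma]] *  [alpha]] =   [y]]
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    """Predict: f(x) = sum_i alpha_i * k(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

With a large `gamma` the model nearly interpolates the training data, which illustrates why `gamma` acts as a regularization knob in this formulation.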

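Of the two node-selection schemes mentioned above, truncation selection simply keeps the best-scoring candidates, while roulette wheel selection samples candidates with probability proportional to fitness, so weaker nodes can still survive and diversity within a layer is preserved. A generic sketch of RWS (not the authors' implementation; the fitness values here are assumed positive, e.g. a reciprocal of modeling error):

```python
import random

def roulette_wheel_select(candidates, fitnesses, k, rng=None):
    """Draw k candidates, each with probability proportional to fitness.

    Assumes all fitnesses are positive (e.g. 1/error for regression
    nodes). Selection is with replacement, as in classic RWS.
    """
    rng = rng or random.Random()
    total = sum(fitnesses)
    chosen = []
    for _ in range(k):
        r = rng.uniform(0.0, total)   # spin the wheel
        acc = 0.0
        for cand, fit in zip(candidates, fitnesses):
            acc += fit
            if acc >= r:              # wheel stops on this slot
                chosen.append(cand)
                break
    return chosen
```

Truncation selection, by contrast, would be `sorted(zip(fitnesses, candidates), reverse=True)[:k]`, which always keeps the same top-k and thus tends to reduce node diversity across layers.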