Abstract

Incremental random weight networks (IRWNs) suffer from weak generalization and overly complicated network structures. A major cause is that the learning parameters of IRWNs are assigned randomly, without guidance, which can introduce many redundant hidden nodes and thereby degrade performance. To resolve this issue, this brief develops a novel IRWN with a compact constraint that guides the assignment of the random learning parameters (CCIRWN). Using Greville's iterative method, a compact constraint that simultaneously ensures the quality of the generated hidden nodes and the convergence of the CCIRWN is constructed to configure the learning parameters. Meanwhile, the output weights of the CCIRWN are evaluated analytically. Two types of learning methods for constructing the CCIRWN are proposed. Finally, the proposed CCIRWN is evaluated on 1-D nonlinear function approximation, several real-world datasets, and data-driven estimation based on industrial data. Numerical and industrial examples indicate that the proposed CCIRWN, despite its compact structure, achieves favorable generalization ability.
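To make the incremental construction concrete, the sketch below shows the generic mechanism the abstract refers to: hidden nodes with random parameters are appended one at a time, and the analytic (least-squares) output weights are maintained via Greville's column-append pseudoinverse recursion rather than refit from scratch. This is a minimal illustration of an unguided IRWN baseline, not the paper's CCIRWN; the compact constraint on the random parameters is precisely what this sketch omits, and all function and variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_hidden_node(X, H, P, y, w, b):
    """Append one random hidden node and update the pseudoinverse P = H^+
    with Greville's column-append recursion, so the output weights remain
    the analytic least-squares solution at every step."""
    h = sigmoid(X @ w + b)              # new hidden-node output column, shape (N, 1)
    if H is None:                        # first node: direct pseudoinverse
        H_new, P_new = h, np.linalg.pinv(h)
    else:
        d = P @ h                        # coordinates of h in the current column space
        c = h - H @ d                    # residual of h outside that space
        if np.linalg.norm(c) > 1e-10:    # Greville's two cases
            bt = c.T / (c.T @ c)
        else:
            bt = (d.T @ P) / (1.0 + d.T @ d)
        P_new = np.vstack([P - d @ bt, bt])
        H_new = np.hstack([H, h])
    beta = P_new @ y                     # analytic output weights
    return H_new, P_new, beta

# Toy 1-D nonlinear approximation target, in the spirit of the paper's first example.
X = np.linspace(-3.0, 3.0, 50).reshape(-1, 1)
y = np.sin(X)

H = P = beta = None
for _ in range(8):
    w = rng.uniform(-1.0, 1.0, size=(1, 1))   # random input weight (no guiding constraint here)
    b0 = rng.uniform(-1.0, 1.0, size=(1,))    # random bias
    H, P, beta = add_hidden_node(X, H, P, y, w, b0)

residual = np.linalg.norm(H @ beta - y)
```

In a CCIRWN, the candidate `(w, b0)` draws would additionally be screened by the compact constraint before being accepted, which is what keeps redundant nodes out of the network.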
