Abstract

A new feedforward architecture is presented for empirical model building and regression. The network consists of two hidden layers of units, where each unit uses a piecewise-linear activation function. A procedure for determining both the number of units and their connectivity is developed. The most notable feature of the network is its associated learning algorithm, which allows for recursive updating of the parameters. A smoothness constraint is employed to limit the range of solutions, so that practical models may be built with small amounts of data. The network is applied to some function estimation tasks, as well as to a forecasting problem using data from the Santa Fe Institute time-series competition.
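For orientation only, the sketch below shows the general shape of such a model: a feedforward network with two hidden layers of piecewise-linear units and a linear output for regression. The specific activation (a ramp clipped to [-1, 1]), the class name, and the layer sizes are assumptions for illustration, not the paper's exact formulation or its recursive learning algorithm.

```python
import numpy as np

def piecewise_linear(x, lo=-1.0, hi=1.0):
    """Assumed piecewise-linear activation: identity on [lo, hi], clipped outside."""
    return np.clip(x, lo, hi)

class TwoHiddenLayerPWLNet:
    """Hypothetical two-hidden-layer network with piecewise-linear units."""

    def __init__(self, n_in, n_h1, n_h2, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Small random initial weights; zero biases.
        self.W1 = rng.normal(scale=0.1, size=(n_h1, n_in)); self.b1 = np.zeros(n_h1)
        self.W2 = rng.normal(scale=0.1, size=(n_h2, n_h1)); self.b2 = np.zeros(n_h2)
        self.W3 = rng.normal(scale=0.1, size=(n_out, n_h2)); self.b3 = np.zeros(n_out)

    def forward(self, x):
        h1 = piecewise_linear(self.W1 @ x + self.b1)
        h2 = piecewise_linear(self.W2 @ h1 + self.b2)
        return self.W3 @ h2 + self.b3  # linear output layer for regression

# Example: evaluate the untrained network on a single scalar input.
net = TwoHiddenLayerPWLNet(n_in=1, n_h1=8, n_h2=8, n_out=1)
y_hat = net.forward(np.array([0.3]))
```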
