Abstract

In the application of layered neural networks to practical problems, high generalization power is required. This paper discusses a method of improving the generalization power of neural networks. The knowledge of the object to be learned is assumed to include the fact that its output function remains invariant over a certain range of input pattern variation. An attempt is made to improve the generalization power by reflecting this invariance in the weights of the neural network. For the case in which the variation of the input pattern can be represented by a linear transformation, it is shown that a sufficient condition for the neural network to have such invariance is that a linear dependency constraint be imposed on the weight representation. A learning process is proposed in which this constraint is introduced into the evaluation function as an additional term. The proposed method can be regarded as a generalization that includes deletion learning methods, such as weight decay and structural learning, as special cases. The relation between generalization power and the VC dimension is also discussed: the improvement achieved by introducing a linear constraint into the weight representation can be evaluated in terms of the resulting reduction of the VC dimension. Lastly, results are presented for an experiment in which the proposed method is applied to a character recognition problem. © 2003 Wiley Periodicals, Inc. Syst Comp Jpn, 34(14): 83–91, 2003; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.1224
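The core idea, augmenting the evaluation function with a term that penalizes violation of a linear constraint on the weights, can be sketched as follows. This is a minimal illustration for a single linear unit under assumptions of this note, not the paper's implementation; the names `evaluation_function`, `C`, and `lam` are hypothetical. Setting `C` to the identity recovers ordinary weight decay, which is consistent with the abstract's claim that deletion learning methods arise as special cases.

```python
# Hypothetical sketch of a penalized evaluation function of the kind the
# abstract describes: fitting error plus a term that pushes the weights
# toward a linear dependency C @ w = 0. All names are illustrative.
import numpy as np

def evaluation_function(w, X, y, C, lam):
    """Squared error of a single linear unit plus a linear-constraint penalty.

    w   : weight vector, shape (d,)
    X   : input patterns, shape (n, d)
    y   : target outputs, shape (n,)
    C   : constraint matrix; each row encodes one linear dependency that the
          assumed invariance imposes on the weights (C @ w should vanish)
    lam : strength of the constraint term
    """
    error = np.mean((X @ w - y) ** 2)       # ordinary fitting error
    constraint = np.sum((C @ w) ** 2)       # ||C w||^2, zero when satisfied
    return error + lam * constraint

def gradient_step(w, X, y, C, lam, eta=0.01):
    """One gradient-descent update of the penalized evaluation function."""
    n = X.shape[0]
    grad = 2.0 * X.T @ (X @ w - y) / n + 2.0 * lam * C.T @ (C @ w)
    return w - eta * grad
```

As a usage example of the constraint matrix: if the output is required to be invariant under exchanging input components 1 and 2 (a linear transformation of the input pattern), the corresponding weight constraint is w1 = w2, encoded as a row of `C` such as [1, -1, 0, ..., 0].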

