Abstract

Recently, the capabilities, limitations, and applications of feedforward networks have been studied. One of the introductory papers is [4] by Lippmann. In this paper, on page 16, it is claimed that "No number of nodes, however, can separate the meshed class regions in Fig. 14 with a two-layer perceptron." However, this underestimates the ability of a two-layer feedforward network. The results of Hornik, Stinchcombe, and White [3] show that a two-layer perceptron can approximate any continuous function arbitrarily closely. An alternate proof of their result, as well as a basic result on three-layer networks, can be found in Blum and Li [1]. This note gives a proof of separation of arbitrary disjoint compact regions by two-layer McCulloch-Pitts (MC-P) networks based on the theorem given in [3]. First, we demonstrate how a two-layer MC-P network separates the case in Fig. 14 of Lippmann [4]. Then, we extend our result to arbitrary compact decision regions by the theorem of Hornik, Stinchcombe, and White [3]. Finally, we give an example and discuss the limitations of two-layer nets with noncompact regions.
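As an illustration of the kind of network the abstract refers to (not the note's own construction), the following NumPy sketch builds a two-layer MC-P network by hand: four hidden threshold units test the half-planes bounding the unit square, and a single threshold output unit fires only when all four agree, so the network outputs 1 exactly on that compact region and 0 on its complement. The choice of region, the weights, and the function names are illustrative assumptions; the note itself treats arbitrary disjoint compact regions, including the meshed regions of Lippmann's Fig. 14.

```python
# A minimal sketch, assuming a hand-chosen compact region (the unit square):
# a two-layer McCulloch-Pitts network -- one hidden layer of threshold units
# feeding a single threshold output -- that separates [0, 1]^2 from its
# complement.  Each hidden unit tests one bounding half-plane; the output
# unit realizes an AND of the four tests via a threshold of 3.5.

import numpy as np

def step(z):
    """McCulloch-Pitts threshold activation: 1 if z >= 0, else 0."""
    return (z >= 0).astype(float)

# Hidden layer: four half-plane detectors for x >= 0, x <= 1, y >= 0, y <= 1.
W_hidden = np.array([[ 1.0,  0.0],    # x >= 0
                     [-1.0,  0.0],    # x <= 1
                     [ 0.0,  1.0],    # y >= 0
                     [ 0.0, -1.0]])   # y <= 1
b_hidden = np.array([0.0, 1.0, 0.0, 1.0])

# Output unit: fires iff all four hidden units fire (sum of 4 >= 3.5).
w_out = np.ones(4)
b_out = -3.5

def two_layer_mcp(points):
    """Return 1 for points inside the unit square, 0 otherwise."""
    h = step(points @ W_hidden.T + b_hidden)   # hidden threshold layer
    return step(h @ w_out + b_out)             # output threshold unit

if __name__ == "__main__":
    pts = np.array([[0.5, 0.5],    # inside  -> 1
                    [0.9, 0.1],    # inside  -> 1
                    [1.5, 0.5],    # outside -> 0
                    [-0.2, 0.3]])  # outside -> 0
    print(two_layer_mcp(pts))      # [1. 1. 0. 0.]
```

A single output threshold can only AND (or similarly combine) half-plane indicators into simple regions like this one; the point of the note, via the theorem of Hornik, Stinchcombe, and White [3], is that one hidden layer already suffices for arbitrary compact decision regions, not just convex ones.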
