Abstract

This thesis concerns aspects of representation and learning in artificial neural networks. The representational issues treated here concern the concept of order, originally defined by Minsky and Papert. The relationship between the order of a neural network mapping problem and the required network fan-in is discussed, and a polynomial-time algorithm for determining the order of a problem is presented. This algorithm also computes the weights required in networks with a single layer of adjustable weights, including higher-order networks and mask-perceptrons. A critical analysis of some of the work of Minsky and Papert, particularly that concerning the parity predicate, is also presented. Learning issues relating to the correct classification of all patterns in a pattern set are then discussed. It has previously been recognized that minimizing an error function based on the l2 norm does not necessarily lead to the correct classification of all patterns in a pattern set, even when such a classification is possible for the network under consideration. Here it is proven that an error function based on the l∞ norm overcomes this shortcoming of l2-based error functions when a correctly classifying solution is sought. In addition, attention is drawn to the fact that any error function based on the l∞ norm is nonsmooth and therefore requires special minimization techniques, a fact that had previously gone unrecognized. Finally, material is drawn from the field of nonsmooth optimization to obtain an algorithm for learning in networks with nonsmooth error functions and nonsmooth neural transfer functions.
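
As a concrete illustration of the order concept (a minimal sketch, not the thesis's order-determination algorithm): in the +/-1 input coding, a single degree-n product term, i.e. one "mask" touching all n inputs, decides the parity predicate, which is why parity has order n. The coding and the brute-force check below are illustrative assumptions.

```python
# A minimal sketch, not the thesis's algorithm: it only illustrates
# Minsky and Papert's notion of order on the parity predicate. In the
# +/-1 coding, one degree-n product (a mask over all n inputs) decides
# parity; the coding and exhaustive check are illustrative assumptions.
import itertools
import numpy as np

n = 4
for bits in itertools.product([0, 1], repeat=n):
    s = np.prod(1 - 2 * np.array(bits))   # equals (-1)**(number of 1-bits)
    even = sum(bits) % 2 == 0
    assert (s == 1) == even               # thresholding s classifies parity
print("a single degree-%d mask decides %d-bit parity on all %d patterns"
      % (n, n, 2 ** n))
```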
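
The learning claims can be sketched in the same spirit. The code below is a hedged illustration, not the thesis's algorithm: it contrasts a smooth l2 error with the nonsmooth l∞ error on a linear single-layer network, and minimizes the l∞ error by subgradient descent with diminishing step sizes, a standard technique from nonsmooth optimization. The data, the linear model, and the step schedule are assumptions made for illustration.

```python
# A minimal sketch, not the thesis's learning algorithm: it contrasts a
# smooth l2 error with the nonsmooth l_inf error on a linear single-layer
# network, and minimizes the l_inf error by subgradient descent with a
# diminishing step size (a standard nonsmooth-optimization scheme). The
# data, model, and step schedule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                  # 20 patterns, 3 inputs
t = np.sign(X @ np.array([1.0, -2.0, 0.5]))   # targets in {-1, +1}

def l2_error(w):
    return np.sum((X @ w - t) ** 2)           # smooth everywhere

def linf_error(w):
    return np.max(np.abs(X @ w - t))          # nonsmooth where the max ties

def linf_subgradient(w):
    r = X @ w - t
    i = np.argmax(np.abs(r))                  # pattern attaining the max error
    return np.sign(r[i]) * X[i]               # a valid subgradient at w

w = np.zeros(3)
for k in range(2000):
    w -= 0.5 / (1 + k) * linf_subgradient(w)  # diminishing steps aid convergence

print("l2 error:", l2_error(w))
print("l_inf error:", linf_error(w))
print("misclassified patterns:", int(np.sum(np.sign(X @ w) != t)))
```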
