Abstract

This paper considers the application of a "global" optimization scheme to the training of multilayer perceptrons for signal classification. The study is motivated by the fact that the error surface of a multilayer perceptron is a highly nonlinear function of its parameters, so backpropagation, which is a gradient-descent algorithm, converges to local minima. As an example, we consider a signal classification problem for which the optimum classifier has been shown to have exponential complexity and the optimum decision boundary to be nonlinear and nonconvex. In this example, when standard backpropagation is used to train the weights of a multilayer perceptron, the network is shown to classify with a "linear" decision boundary, which corresponds to a local minimum of the network configuration. We propose to enhance the learning process of the network with an optimization scheme referred to as simulated annealing, which has proven effective at finding global minima in many applications. We derive an iterative training algorithm based on this "global" optimization technique, using backpropagation as the "local" optimizer, and we verify the effectiveness of the learning algorithm via an empirical analysis of two signal classification problems.

1 PRELIMINARIES

Artificial neural networks are highly interconnected networks of relatively simple processing units (commonly referred to as nodes, e.g., perceptrons) that operate in parallel.
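To make the proposed hybrid scheme concrete, the following is a minimal sketch in Python of the general idea, not the paper's implementation: plain gradient descent on a one-hidden-layer network stands in for the backpropagation "local" optimizer, and a Metropolis accept/reject loop with geometric cooling stands in for the annealing schedule. The network size, learning rate, initial temperature, cooling rate, and the XOR-style toy problem are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def forward(params, X):
        W1, b1, W2, b2 = params
        h = np.tanh(X @ W1 + b1)                      # hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # sigmoid output
        return p, h

    def loss(params, X, y):
        p, _ = forward(params, X)
        return float(np.mean((p - y) ** 2))           # mean squared error

    def backprop_step(params, X, y, lr=0.5):
        # One step of plain gradient descent: the "local" optimizer.
        W1, b1, W2, b2 = params
        p, h = forward(params, X)
        d = 2.0 * (p - y) * p * (1.0 - p) / len(y)    # dL/d(output pre-activation)
        dh = np.outer(d, W2) * (1.0 - h ** 2)         # backpropagated hidden error
        return [W1 - lr * (X.T @ dh), b1 - lr * dh.sum(axis=0),
                W2 - lr * (h.T @ d), b2 - lr * d.sum()]

    def anneal_train(X, y, T=1.0, cool=0.93, outer=80, inner=25, jump=0.4):
        # Simulated annealing over the weight space: perturb the weights,
        # refine the perturbed configuration with a few backpropagation
        # steps, then accept or reject with the Metropolis rule while the
        # temperature T follows a geometric cooling schedule.
        params = [rng.normal(0, 0.5, (X.shape[1], 8)), np.zeros(8),
                  rng.normal(0, 0.5, 8), np.zeros(1)]
        cur = loss(params, X, y)
        best, best_loss = [p.copy() for p in params], cur
        for _ in range(outer):
            trial = [p + T * jump * rng.normal(size=p.shape) for p in params]
            for _ in range(inner):
                trial = backprop_step(trial, X, y)
            new = loss(trial, X, y)
            # Accept improvements always; uphill moves with prob exp(-dE/T).
            if new < cur or rng.random() < np.exp(-(new - cur) / T):
                params, cur = trial, new
                if cur < best_loss:
                    best, best_loss = [p.copy() for p in params], cur
            T *= cool
        return best, best_loss

    # Toy problem with a nonconvex optimal boundary (XOR-style labels).
    X = rng.uniform(-1.0, 1.0, (200, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float)
    w, final = anneal_train(X, y)
    print(f"final training MSE: {final:.4f}")

The occasional accepted uphill move is what lets the search escape the kind of poor local minimum (e.g., the "linear" boundary) that plain backpropagation can get stuck in; as the temperature decreases, the procedure increasingly behaves like pure local descent.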
