Abstract

The performance of neural networks is determined by their capacity and by their generalization ability, i.e., their robustness to noise. Generalization is the ability of a trained network to classify an input correctly even if it is not a member of the training set; capacity is the amount of information that can be reliably stored in the network. Both depend on the network architecture and on the learning scheme employed for training. The performance of single-layered neural networks trained using the outer-product rule has been studied extensively (Amari and Maginu 1988; McEliece et al. 1987). According to these analytical studies, networks trained with the outer-product rule have low capacity and low generalization ability, and their efficiency in practical applications is expected to be even lower than predicted, owing to the simplifying assumptions made in the analyses. The performance evaluation of optimally trained neural networks has not attracted attention proportional to their potential, probably because of the analytical difficulties involved. Nevertheless, evaluating optimally trained networks provides a reliable upper bound on the operational efficiency attainable by single-layered neural networks.
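The outer-product (Hebbian) rule discussed above can be illustrated with a minimal sketch of a single-layered associative network: patterns are stored via the sum of their outer products, and noisy inputs are recalled by iterated thresholding. The sizes, seed, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): N units, P stored bipolar patterns.
# The load P/N = 0.05 is kept well below the known capacity limit,
# so recall from a noisy probe should succeed.
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))

# Outer-product rule: W = (1/N) * sum_mu x_mu x_mu^T, zero self-coupling.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=20):
    """Synchronous update x <- sign(W x) until a fixed point or step limit."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Flip a few bits of a stored pattern and check that recall corrects them,
# illustrating robustness to noise (generalization) at low load.
noisy = patterns[0].copy()
noisy[:5] *= -1
recovered = recall(noisy)
overlap = (recovered @ patterns[0]) / N  # 1.0 means perfect recall
```

At higher loads the same rule degrades sharply, which is the low-capacity behavior the analytical studies cited above characterize.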
