Abstract

We present a case study in which multilayer feedforward networks are developed to compute probability distributions corresponding to given input patterns. Normalized squashing functions are used for the output neurons, and the Kullback-Leibler relative entropy is adopted as a measure of the departure of computed output distributions from their targets. The evolution of weights is governed by a stochastic back-propagation algorithm based on the entropic cost function. To improve generalization, cycles of pruning and retraining are implemented. The development is framed in terms of the concrete problem of learning and prediction of the systematics of stability and decay of nuclear ground states. For a given input nuclide, characterized by its proton and neutron numbers, a network is required to generate the associated probability distribution over the options of stability and four different modes of decay. With training and test sets provided by the Brookhaven nuclear data facility, a variety of feedforward architectures have been explored, yielding a number of models that perform with high quality in both learning and prediction. The nature of the underlying physical problem is such that it would be very difficult to achieve this quality with a global model based on conventional nuclear theory. The work is introduced by a brief survey of other scientific applications of neural networks.
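The ingredients named above (a normalized-squashing, i.e. softmax, output layer; a Kullback-Leibler cost; pattern-by-pattern stochastic back-propagation) can be sketched as follows. This is a minimal illustration, not the authors' code: the network sizes, the scaled (Z, N) inputs, and the one-hot target distributions are invented for demonstration, and the pruning/retraining cycles mentioned in the abstract are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())            # shift for numerical stability
    return e / e.sum()

def kl_divergence(p, q, eps=1e-12):
    # D(p || q) = sum_k p_k log(p_k / q_k); entries with p_k = 0 contribute 0
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

class SoftmaxNet:
    """Two-layer net: tanh hidden units, softmax output over 5 classes
    (stability plus four decay modes)."""
    def __init__(self, n_in=2, n_hidden=8, n_out=5):
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        q = softmax(self.W2 @ h + self.b2)
        return h, q

    def train_step(self, x, p, lr=0.1):
        # Backprop for the KL cost: the gradient at the softmax input
        # reduces to q - p, as with cross-entropy.
        h, q = self.forward(x)
        delta2 = q - p
        delta1 = (self.W2.T @ delta2) * (1.0 - h**2)   # through tanh
        self.W2 -= lr * np.outer(delta2, h)
        self.b2 -= lr * delta2
        self.W1 -= lr * np.outer(delta1, x)
        self.b1 -= lr * delta1

# Illustrative toy data: scaled (Z, N) inputs paired with one-hot targets.
data = [
    (np.array([0.26, 0.30]), np.eye(5)[0]),   # e.g. a stable nuclide
    (np.array([0.92, 1.46]), np.eye(5)[1]),   # e.g. an alpha emitter
    (np.array([0.55, 0.82]), np.eye(5)[2]),   # e.g. a beta emitter
]

net = SoftmaxNet()
before = np.mean([kl_divergence(p, net.forward(x)[1]) for x, p in data])
for _ in range(500):
    for x, p in data:          # stochastic: update after each pattern
        net.train_step(x, p)
after = np.mean([kl_divergence(p, net.forward(x)[1]) for x, p in data])
```

Because the softmax output always sums to one, the network's output is a valid probability distribution by construction, and minimizing the KL divergence drives it toward the target distribution for each nuclide.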
