Abstract

Multi-Layer Neural Networks (MLNNs) are known to model the statistical properties of their training data. Several authors have shown that, depending on the objective function chosen, MLNNs estimate the posterior class probabilities of their inputs, provided the network is trained with binary desired outputs. It has recently been shown that conditions exist that define a general class of objective functions that provide such probability estimates. This paper introduces a method of generating such objective functions. The generator is simple to use and, so far, has been found to be universally applicable. Known objective functions, including the mean-squared error (MSE) and the cross-entropy (CE) measure, are generated here as examples of its application. To demonstrate the potential of the method, a new objective function is derived and discussed. This work provides practising engineers with an explicit method for generating objective functions for use in their classification applications.
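
For reference, the two objective functions named above take their standard forms in the literature; the sketch below states them for a network output $y_k(\mathbf{x})$ and binary desired output $d_k \in \{0,1\}$ (the notation is illustrative and not taken from the paper's generator):

```latex
% Standard forms of the two objective functions cited in the abstract,
% written for network outputs y_k(x) and binary targets d_k (illustrative notation).
\begin{align}
  E_{\mathrm{MSE}} &= \frac{1}{2} \sum_{k} \bigl( d_k - y_k(\mathbf{x}) \bigr)^2 \\
  E_{\mathrm{CE}}  &= - \sum_{k} \Bigl[ d_k \ln y_k(\mathbf{x})
                      + (1 - d_k) \ln \bigl( 1 - y_k(\mathbf{x}) \bigr) \Bigr]
\end{align}
```

Both measures are minimised, in expectation, when $y_k(\mathbf{x})$ equals the posterior class probability $P(C_k \mid \mathbf{x})$, which is the property the general class of objective functions discussed in the paper is required to share.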
