Abstract

Divergence measures based on two entropy families are studied. One family comprises the entropies of degree α; the second comprises the entropies of order α, also known as the Rényi entropies. Both types of divergence measures yield effective quality functions for guiding the growth and optimization of feed-forward neural networks built of linear threshold units. These functions are of particular value in the multi-category case. Important properties of these quality functions include their convexity on the domain of optimization and their greediness in splitting internal representations. As a consequence of these properties, the quality functions result in compact neural networks with good generalization properties. The suitability of some of the divergence measures to serve as quality functions is verified in a benchmark study. The divergence measures discussed in this paper are of great importance to the field of constructive learning.
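
For reference, the two entropy families named above can be written in their standard forms. The following is a sketch using common textbook normalizations; the paper's exact conventions and quality functions may differ. For a discrete distribution P = (p_1, ..., p_n) and a parameter α > 0, α ≠ 1:

```latex
% Entropy of degree \alpha (Havrda--Charvat / Daroczy form):
H^{\mathrm{deg}}_{\alpha}(P) = \frac{1}{2^{\,1-\alpha}-1}\left(\sum_{i=1}^{n} p_i^{\alpha} - 1\right)

% Entropy of order \alpha (Renyi form):
H^{\mathrm{ord}}_{\alpha}(P) = \frac{1}{1-\alpha}\,\log_2 \sum_{i=1}^{n} p_i^{\alpha}

% Both families recover the Shannon entropy in the limit \alpha \to 1.
% The standard order-\alpha (Renyi) divergence between P and Q is
D_{\alpha}(P \,\|\, Q) = \frac{1}{\alpha-1}\,\log_2 \sum_{i=1}^{n} p_i^{\alpha}\, q_i^{\,1-\alpha},
% which reduces to the Kullback--Leibler divergence as \alpha \to 1.
```

The abstract does not fix which normalization or divergence serves as the quality function; the forms above are the standard definitions on which such measures are built.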
