Abstract

We explore granular output optimization of neural networks with fixed connections over a given input space. Numeric output optimization is already a highly nonlinear problem when nonlinear activation functions are used; granular output optimization is even more challenging. We address it by developing an optimal hierarchical allocation of information granularity, proposing a new objective function that balances specificity against evidence, and employing efficient evolutionary optimization techniques. In contrast to existing techniques, the hierarchical approach builds a three-level hierarchy that allocates information granularity to both the input space and the architecture (parameters) of the network. Granulating the input features and the architecture simultaneously yields a different result than granulating either factor alone. The resulting granular neural network reflects the abstract nature of the data and the granular nature of the architecture's nonlinear mapping. Experimental studies on synthetic data and publicly available data sets illustrate the algorithm.
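The abstract's core idea can be sketched concretely. The following is a minimal illustration, not the paper's actual method: it assumes granularity is allocated to the weights of a small fixed network as intervals W ± ε|W|, that intervals are propagated by interval arithmetic through monotone activations, and that the objective trades evidence (coverage of observed targets) against specificity (narrow output intervals). All names, the tiny 2-3-1 architecture, and the product-form criterion are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed-connection 2-3-1 network with tanh hidden units;
# these random weights stand in for a trained network.
W1, b1 = rng.normal(size=(3, 2)), rng.normal(size=3)
W2, b2 = rng.normal(size=(1, 3)), rng.normal(size=1)

def forward(x):
    """Crisp (numeric) output of the network."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

def iv_matmul(Wl, Wu, xl, xh):
    """Sound interval bounds on W @ x for W in [Wl, Wu], x in [xl, xh]
    (entrywise): take min/max over the four corner products."""
    cands = np.stack([Wl * xl, Wl * xh, Wu * xl, Wu * xh])
    return cands.min(axis=0).sum(axis=1), cands.max(axis=0).sum(axis=1)

def granular_forward(x, eps1, eps2):
    """Granulate each layer's weights into intervals W +/- eps*|W|
    (a per-layer granularity allocation) and propagate a crisp input."""
    lo, hi = iv_matmul(W1 - eps1 * np.abs(W1), W1 + eps1 * np.abs(W1), x, x)
    lo, hi = np.tanh(lo + b1), np.tanh(hi + b1)  # tanh is monotone increasing
    lo, hi = iv_matmul(W2 - eps2 * np.abs(W2), W2 + eps2 * np.abs(W2), lo, hi)
    return lo + b2, hi + b2

# Objective sketch: evidence (coverage of noisy targets) traded against
# specificity (which decays with interval width), combined as a product.
X = rng.normal(size=(50, 2))
Y = np.array([forward(x) for x in X]) + 0.05 * rng.normal(size=(50, 1))
los, his = zip(*(granular_forward(x, 0.05, 0.05) for x in X))
los, his = np.array(los), np.array(his)
coverage = np.mean((los <= Y) & (Y <= his))   # evidence
specificity = np.exp(-np.mean(his - los))     # narrower intervals score higher
score = coverage * specificity
```

In the paper's setting, the allocation (here the fixed pair `eps1`, `eps2`, plus granularity for the input features) would be searched by evolutionary optimization under a total granularity budget; the sketch only evaluates one candidate allocation.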
