Abstract

The paper introduces a new extension of the ontogenic Self-Optimizing Neural Networks (SONNs) [4] that makes it possible to optimize a neural network (NN) topology for the whole training data (TD) set at once. Classical SONNs optimize an NN topology only for the subnetworks related to the trained classes; the extension described here optimizes the topology for all classes simultaneously. Moreover, it can compute a minimal SONN topology for the given TD, although such a minimal topology can sometimes be insufficient for generalization. The extension computes improved discrimination coefficients and automatically develops a topology that reflects all well-discriminating data features, in order to achieve good generalization. Furthermore, it can automatically reduce the input dimension of any TD and automatically recognize and correctly classify inverted inputs (which is especially important for image classification). All computations of the extended SONN are fully automatic and deterministic; no user-supplied parameters are required. SONNs are free from many training problems, i.e. initialization, convergence, and overfitting. The extended SONNs can also be used for unsupervised training.
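
To make the discrimination-based input reduction concrete, the following Python sketch illustrates one plausible way per-class discrimination coefficients and the resulting dimension reduction could be organized. The function names, the range-overlap score, and the zero-score pruning rule are illustrative assumptions for this sketch, not the coefficients defined in the paper.

    import numpy as np

    def discrimination_coefficients(X, y):
        """Per-class, per-feature discrimination scores (illustrative).

        For each class c and feature f, score the fraction of class-c
        samples whose value of f lies outside the range that f takes in
        all other classes. A score of 0 means the value ranges overlap
        completely, so f alone cannot separate class c from the rest.
        """
        classes = np.unique(y)
        D = np.zeros((len(classes), X.shape[1]))
        for ci, c in enumerate(classes):
            in_c = X[y == c]
            rest = X[y != c]
            lo, hi = rest.min(axis=0), rest.max(axis=0)
            # Fraction of class-c samples outside the other classes' range.
            outside = (in_c < lo) | (in_c > hi)
            D[ci] = outside.mean(axis=0)
        return classes, D

    def reduce_input_dimension(X, y):
        """Keep only features that discriminate at least one class."""
        _, D = discrimination_coefficients(X, y)
        keep = D.max(axis=0) > 0.0
        return X[:, keep], keep

    if __name__ == "__main__":
        # Feature 0 separates the two classes; feature 1 is constant
        # and carries no discriminative information, so it is dropped.
        X = np.array([[0.1, 5.0], [0.2, 5.0], [0.9, 5.0], [0.8, 5.0]])
        y = np.array([0, 0, 1, 1])
        X_reduced, keep = reduce_input_dimension(X, y)
        print(keep)       # [ True False]
        print(X_reduced)  # only feature 0 remains

Under the same kind of assumption, inversion-aware classification could be handled by scoring both an input x and its inversion -x against the trained subnetworks and reporting the better match; the paper's actual mechanism may differ.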
