Abstract

The recently proposed network model, Operational Neural Networks (ONNs), generalizes conventional Convolutional Neural Networks (CNNs), which are homogeneous networks built on a single linear neuron model. As a heterogeneous network model, ONNs are based on a generalized neuron model that can encapsulate any set of non-linear operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. However, the default search method for finding optimal operators in ONNs, the so-called Greedy Iterative Search (GIS) method, usually takes several training sessions to find a single operator set per layer. This is not only computationally demanding but also limits network heterogeneity, since the same operator set is then used for all neurons in each layer. To address this deficiency and exploit a superior level of heterogeneity, this study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network based on the "Synaptic Plasticity" paradigm, which poses the essential learning theory in biological neurons. During training, each operator set in the library can be evaluated by its synaptic plasticity level and ranked from worst to best, and an "elite" ONN can then be configured using the top-ranked operator sets found at each hidden layer. Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, achieve a superior learning performance compared with GIS-based ONNs, and as a result the performance gap over CNNs widens further.
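The generalized neuron model at the heart of ONNs replaces the fixed multiply-and-sum of a linear neuron with configurable nodal and pool operators followed by an activation. The following is a minimal sketch, not the authors' implementation: the specific operator choices (`sin`, `max`, etc.) and function names are illustrative assumptions; only the nodal/pool/activation composition follows the text.

```python
import numpy as np

# Candidate nodal operators: applied per incoming connection.
NODAL = {
    "mul": lambda w, x: w * x,          # linear (CNN-style) nodal operator
    "sin": lambda w, x: np.sin(w * x),  # an illustrative non-linear alternative
}

# Candidate pool operators: aggregate the nodal outputs.
POOL = {
    "sum": np.sum,
    "max": np.max,
}

def operational_neuron(x, w, nodal="sin", pool="sum", activation=np.tanh):
    """y = f( P( psi(w_k, x_k) ) ) over all incoming connections."""
    return activation(POOL[pool](NODAL[nodal](w, x)))

x = np.array([0.1, -0.2, 0.3])
w = np.array([0.5, 0.4, -0.1])

# With nodal="mul" and pool="sum", the neuron reduces to the
# conventional linear model: tanh(w . x).
y_linear = operational_neuron(x, w, nodal="mul", pool="sum")
```

An operator set in this framing is simply one (nodal, pool, activation) triple; the heterogeneity the abstract refers to comes from assigning different triples to different neurons.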

Highlights

  • Neurons of a mammalian brain communicate with each other through synaptic connections [1], which control the "strength" of the signals transmitted between neurons via their individual neuro-chemical characteristics.

  • The results clearly show that the elite and the worst Operational Neural Networks (ONNs), configured according to the top-S and bottom-S operator sets found during synaptic plasticity monitoring (SPM), obtain the best and the worst results, respectively.

  • Synaptic plasticity is a natural process that enables the learning of a new ability, concept, or response to changes in environmental stimuli.

Introduction

Neurons of a mammalian brain communicate with each other through synaptic connections [1], which control the "strength" of the signals transmitted between neurons via their individual neuro-chemical characteristics. Traditional homogeneous ANNs, e.g., Multi-Layer Perceptrons (MLPs) [11, 12] with the linear neuron model, can only approximate the ongoing learning process based on the responses to the training samples, and they are considered "Universal Approximators." This is perhaps the reason for the significant variations observed in their learning performance. The heterogeneity of the network increases with the number of operator sets used at each layer, whereas the least heterogeneous ONN can still be configured with a single operator set (the one with the highest HF ever achieved on that layer) assigned to all neurons in each layer. In this way, we are able to evaluate the role of network heterogeneity in learning performance on three challenging problems: image denoising, synthesis, and transformation.
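The configuration step described above, keeping the top-ranked operator sets per hidden layer by their health factor (HF), can be sketched as follows. This is a hedged illustration: the HF scores and operator-set names below are invented for the example, and only the per-layer ranking-and-selection logic follows the text.

```python
def elite_config(hf_per_layer, top_s=2):
    """For each hidden layer, keep the top-S operator sets ranked by HF."""
    config = {}
    for layer, scores in hf_per_layer.items():
        ranked = sorted(scores, key=scores.get, reverse=True)
        config[layer] = ranked[:top_s]
    return config

# Hypothetical HF scores recorded during synaptic plasticity monitoring;
# each key is an assumed "nodal/pool/activation" operator-set label.
hf = {
    "layer1": {"sin/sum/tanh": 0.91, "mul/sum/tanh": 0.62, "exp/max/tanh": 0.74},
    "layer2": {"sin/sum/tanh": 0.58, "mul/sum/tanh": 0.81, "exp/max/tanh": 0.66},
}

elite = elite_config(hf, top_s=2)
# top_s=1 yields the least heterogeneous variant: one operator set per
# layer (the one with the highest HF), as described in the introduction.
```

Replacing `reverse=True` with `reverse=False` would instead select the bottom-S sets, i.e., the "worst" ONN used as a baseline in the highlights.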

Operational neural networks
Synaptic plasticity monitoring
Experimental Setup
SPM Results
Comparative Evaluations and Results
Conclusions
Findings
Compliance with ethical standards