Abstract

Problem statement: Progress in enhancing the generalization ability of neural networks, especially feed-forward models, has been limited. The major reason behind this limitation is the principal definition of generalization and the inability to translate it into a convenient structure. Traditional schemes have unfortunately regarded generalization as an innate outcome of simple association, as referred to by Pavlov and modeled by Piaget as the basis of assimilating conduct. Approach: A new generalization approach based on the addition of a supportive layer to the traditional neural network scheme (the atomic scheme) was presented. This approach extended the signal propagation of the whole net so that output is generated in two modes: one deals with the required output of trained patterns with predefined settings, while the other tolerates dynamic output generation, with tuning capability, for any newly applied input. Results: Experiments and analysis showed that the new approach is not only simpler and easier, but also very effective, as the proportions promoting the generalization ability of neural networks reached over 90% in some cases. Conclusion: Expanding the neuron as the essential construction for generalization denoted accommodating capabilities involving all the innate structures, in conjunction with intelligence abilities and with the needs of further advanced learning phases. Cogent results were attained in comparison with those of the traditional schemes.
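The abstract describes the architecture only at a high level, so the following Python sketch is one plausible reading of it, not the authors' implementation: a conventional feed-forward pass extended with a supportive layer of stored patterns, and a similarity test that switches between the two output modes. The class name SupportiveNet, the cosine-similarity test, the threshold value, and the blending rule for the dynamic mode are all illustrative assumptions.

```python
# A minimal sketch (an assumption-based reading of the abstract, not the
# authors' method): a feed-forward net plus a "supportive" layer of stored
# trained patterns, with two output modes.
import numpy as np

rng = np.random.default_rng(0)

class SupportiveNet:
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
        self.patterns = []   # supportive layer: stored trained patterns
        self.responses = []  # their predefined (required) outputs

    def _forward(self, x):
        h = np.tanh(x @ self.W1)   # atomic (traditional) scheme
        return np.tanh(h @ self.W2)

    def memorize(self, x, y):
        """Store a trained pattern together with its required output."""
        self.patterns.append(np.asarray(x, float))
        self.responses.append(np.asarray(y, float))

    def respond(self, x, threshold=0.9):
        """Two output modes, switched by similarity to trained patterns."""
        x = np.asarray(x, float)
        k = None
        if self.patterns:
            sims = [np.dot(x, p) / (np.linalg.norm(x) * np.linalg.norm(p) + 1e-12)
                    for p in self.patterns]
            k = int(np.argmax(sims))
            if sims[k] >= threshold:      # mode 1: trained pattern,
                return self.responses[k]  # predefined output
        # Mode 2: unseen input -- generate output dynamically by blending
        # the net's own propagation with the nearest stored response
        # (the blend stands in for the paper's "tuning capability").
        y = self._forward(x)
        if k is not None:
            y = 0.5 * y + 0.5 * self.responses[k]
        return y

# Tiny demo: memorize one pattern, then query a seen and an unseen input.
net = SupportiveNet(n_in=2, n_hidden=4, n_out=1)
net.memorize([1.0, 0.0], [1.0])
print(net.respond([1.0, 0.0]))   # mode 1: predefined response [1.0]
print(net.respond([0.2, 0.9]))   # mode 2: dynamically tuned response
```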

Highlights

  • Generalization ability of Neural Networks (NNs) is considered the most important performance criterion[1]

  • A neural network should learn a relation from limited data and properly respond to unseen input[8]. Since it is impossible for NNs to solve all problems by learning from limited examples, developing new methods for improving NNs' generalization ability is highly needed

  • Throughout the literature on the developed models, generalization is envisaged as an intuitive side effect of the connection schemes; the current paper proposes a modified structure based on the Pavlov and Piaget theorems[11,12] in order to make generalization a learned process rather than a passive side effect

INTRODUCTION

Generalization ability of Neural Networks (NNs) is considered the most important performance criterion[1]. The proposed model is designed to merge both classical and generalization learning characteristics simultaneously in one network, simulating human conduct in relation to the responses of the different mental activities adopted for various levels of timing, so that generalization becomes a learned process rather than the passive behavior of an association scheme. This might address the major obstacle that stands behind improving the generalization capability of the traditional connection schemes, where generalization enhancement had been attributed to data consideration in output generation. The major association attributes of the supporting layer denote the weight values of the connections needed to link the band-selector neuron to its neurons; these weights are committed to the second cycle of model training.
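To make the two training cycles concrete, here is a minimal Python sketch under stated assumptions: cycle 1 is ordinary gradient training of the atomic feed-forward scheme, and cycle 2 freezes those weights and fits only the vector v of connections linking the band-selector neuron to the supportive layer's neurons. The selector's target (firing near 1 for trained patterns) and the learning rule are hypothetical, since the paper's exact equations are not reproduced on this page.

```python
# Illustrative sketch of the two training cycles; the band-selector
# objective in cycle 2 is an assumption, not the paper's stated rule.
import numpy as np

def train_two_cycles(X, Y, n_hidden=8, lr=0.1, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))

    # Cycle 1: ordinary gradient descent on the atomic (feed-forward) scheme.
    for _ in range(epochs):
        H = np.tanh(X @ W1)                       # hidden-layer activations
        G = (H @ W2 - Y) / len(X)                 # d(MSE/2)/d(output)
        dW1 = X.T @ ((G @ W2.T) * (1 - H**2))
        W2 -= lr * H.T @ G
        W1 -= lr * dW1

    # Cycle 2: with W1 and W2 frozen, fit only v, the weights linking the
    # band-selector neuron to the supporting layer's neurons, so that the
    # selector fires (~1) for trained patterns; weak firing on a future
    # input would then route the net to its dynamic output mode.
    H = np.tanh(X @ W1)
    v = np.zeros(n_hidden)
    for _ in range(epochs):
        s = 1.0 / (1.0 + np.exp(-(H @ v)))        # selector activation
        v -= lr * H.T @ ((s - 1.0) * s * (1.0 - s)) / len(X)
    return W1, W2, v

# Hypothetical usage on XOR-like data:
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])
W1, W2, v = train_two_cycles(X, Y)
```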
