Abstract

This article argues for modeling new bioinspired properties in rate-coding artificial neurons, focusing on fundamental neural properties rarely implemented in artificial neurons so far: intrinsic plasticity, metaplasticity of synaptic strength, and lateral inhibition among neighboring neurons. All of these properties are bioinspired through empirical models developed by neurologists, which in turn helps raise perceptrons to a higher potential level. Metaplasticity and intrinsic plasticity are different levels of plasticity that neurologists believe play fundamental roles in memory and learning, and therefore in the performance of neurons. Assuming that information about stimuli is encoded in the firing rates of the connections among biological neurons, several artificial implementations have been tested. Analyzing their results, and comparing their learning and performance with state-of-the-art models, yields advances relevant to the developing Industrial Revolution 4.0, which builds on advances in Machine Learning, and may even initiate a new generation of artificial neural networks. As an example, a single-layer perceptron incorporating the proposed advances, called the Competitive Perceptron, is successfully trained to perform the XOR function. This new bioinspired artificial neuronal model offers non-linear separability, continuous learning, and scalability, making it suitable for building efficient Deep Networks and overcoming the basic limitations of traditional perceptrons that have challenged scientists for half a century.
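To give a flavor of one of the mechanisms named above, the following is a minimal sketch of lateral inhibition among rate-coded neurons. It is an illustrative assumption on our part, not the article's actual model: each neuron's firing rate is reduced in proportion to the summed rates of its neighbors, so the most active neuron tends to dominate (a competitive, winner-take-all-like effect). The function name, the `strength` parameter, and the linear update rule are all hypothetical.

```python
def lateral_inhibition(rates, strength=0.5):
    """Suppress each firing rate by its neighbors' summed activity.

    Illustrative sketch only: `rates` is a list of non-negative firing
    rates, and `strength` scales how strongly neighbors inhibit each
    other. Rates are clipped at zero, since a firing rate cannot be
    negative.
    """
    total = sum(rates)
    # Each neuron is inhibited by the total activity of all *other* neurons.
    return [max(r - strength * (total - r), 0.0) for r in rates]

# The strongest neuron survives the competition; weaker ones are silenced.
out = lateral_inhibition([1.0, 0.6, 0.2])  # → [0.6, 0.0, 0.0]
```

Under this toy rule, competition sharpens the population response: only the neuron with the highest initial rate remains active, which hints at how such a mechanism could help a network of otherwise linear units carve up a non-linearly separable input space.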
