Abstract

The training algorithm studied in this paper is inspired by the biological metaplasticity property of neurons. During the training phase, the Artificial Metaplasticity Learning Algorithm can be regarded as a new probabilistic version of the presynaptic rule: the algorithm assigns larger weight updates to the less probable activations than to those with higher probability. The algorithm is proposed for Artificial Neural Networks in general, although so far it has been implemented and tested only for Multilayer Perceptrons. Experiments on several multidisciplinary applications show considerably more efficient training, raising Multilayer Perceptron results to the performance of the best state-of-the-art systems, which are usually far more complex.
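The abstract describes updates that are scaled inversely with the estimated probability of an activation pattern. A minimal sketch of that idea follows; it is not the paper's actual formulation. The Gaussian-style probability estimate and the parameters `A` and `B` are assumptions introduced here purely for illustration:

```python
import numpy as np

def metaplastic_scale(x, A=1.0, B=0.5):
    """Illustrative metaplasticity factor (hypothetical form, not the paper's).

    Estimates the probability of the input pattern x with an unnormalised
    Gaussian density and returns its inverse, so that rarer (less probable)
    patterns yield proportionally larger weight updates.
    """
    N = x.size
    p_est = A * np.exp(-B * np.sum(x ** 2)) / np.sqrt(2.0 * np.pi) ** N
    return 1.0 / p_est

def metaplastic_update(w, grad, x, lr=0.1):
    """One gradient step whose magnitude is modulated by the pattern's rarity."""
    return w - lr * metaplastic_scale(x) * grad
```

Under this sketch, an input with a larger norm (less probable under the assumed Gaussian estimate) receives a larger update than a near-zero input, which captures the probabilistic-presynaptic intuition stated in the abstract.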
