Abstract
The metaplasticity property of biological synapses is interpreted in this paper as the concept of placing greater emphasis on training patterns that are less frequent. A novel implementation is proposed in which, during the network's learning phase, weight updates triggered by less frequent activations are given priority over those triggered by more frequent ones. Modeling this interpretation in the training phase, the hypothesis of improved training is tested on a Multilayer Perceptron trained with Backpropagation. The results obtained for the chosen application show much more efficient training, while at least maintaining the Multilayer Perceptron's performance.
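The core idea can be illustrated with a minimal sketch: scale each backpropagation step by an inverse estimate of how frequent the current input pattern is, so rare patterns produce larger weight updates. The Gaussian-like density estimate, the clipping constant `eps`, and the toy single-neuron network below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rarity_weight(x, mean, eps=0.1):
    # Assumed Gaussian-like density estimate around the input mean;
    # rarer patterns (farther from the mean) get a larger update weight.
    # eps caps the weight at 1/eps so updates stay bounded.
    density = np.exp(-0.5 * np.sum((x - mean) ** 2))
    return 1.0 / (density + eps)

# Toy logistic neuron trained with frequency-weighted gradient steps.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # linearly separable labels
mean = X.mean(axis=0)

W = rng.normal(scale=0.1, size=2)
lr = 0.05
for x_i, y_i in zip(X, y):
    p = 1.0 / (1.0 + np.exp(-W @ x_i))          # logistic output
    grad = (p - y_i) * x_i                      # standard gradient step
    W -= lr * rarity_weight(x_i, mean) * grad   # metaplasticity-style scaling

accuracy = np.mean((X @ W > 0) == (y > 0.5))
```

The only change relative to plain stochastic backpropagation is the multiplicative `rarity_weight` factor; the gradient itself is unmodified.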