Abstract
We propose to use a biologically motivated learning rule based on neural intrinsic plasticity to optimize reservoirs of analog neurons. The rule follows an information maximization principle and is local in time and space, and therefore computationally efficient. We show experimentally that it can drive the neurons' output activities to approximate exponential distributions, thereby implementing sparse codes in the reservoir. Because of its incremental nature, intrinsic plasticity learning is well suited for joint application with online backpropagation-decorrelation or least mean squares reservoir learning, whose performance it can strongly improve. We further show that classical echo state regression also benefits from reservoirs that are pre-trained on the given input signal with the intrinsic plasticity rule.
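The abstract does not spell out the update equations, but intrinsic plasticity rules of this kind are commonly formulated as in Triesch's work: for a sigmoid neuron y = 1/(1 + exp(-(a·x + b))), the gain a and bias b are adapted online so that the output distribution approaches an exponential with target mean mu. The following is a minimal sketch under those assumptions (the learning rate eta, target mean mu, and Gaussian input are illustrative choices, not values from the paper):

```python
import math
import random

rng = random.Random(0)
eta, mu = 0.01, 0.2   # learning rate and target mean activity (assumed values)
a, b = 1.0, 0.0       # gain and bias of the sigmoid neuron

ys = []
for _ in range(20000):
    x = rng.gauss(0.0, 1.0)                      # synthetic driving input
    y = 1.0 / (1.0 + math.exp(-(a * x + b)))     # neuron output in (0, 1)
    # Triesch-style intrinsic plasticity update: local in time and space,
    # pushing the output distribution toward an exponential with mean mu
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu)
    da = eta / a + db * x
    a, b = a + da, b + db
    ys.append(y)

mean_act = sum(ys[-5000:]) / 5000.0              # mean activity after adaptation
```

Because each update uses only the neuron's own input and output at the current time step, the rule can run incrementally alongside an online readout-learning scheme, which is the joint application the abstract refers to.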