Abstract
We study how individual memory items are stored, assuming that situations given in the environment can be represented as synaptic-like couplings in recurrent neural networks. Previous numerical investigations have shown that specific architectures based on suppression or max units can successfully learn static or dynamic stimuli (situations). Here we provide a theoretical basis concerning the convergence of the learning process and the network's response to a novel stimulus. We show that, besides learning "simple" static situations, an n-dimensional network can learn and replicate a sequence of up to n different vectors or frames. We derive limits on the learning rate and show how the coupling matrices develop during training in several cases, including an extension of the network to nonlinear inter-unit coupling. Furthermore, we show that a specific coupling matrix gives the units low-pass-filter properties, thus connecting networks built from static summation units with continuous-time networks. We also show under which conditions such networks can perform arithmetic calculations by means of pattern completion.
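To make the sequence-storage claim concrete, the following is a minimal sketch, not the authors' suppression/max architecture: a linear n-dimensional coupling matrix W is trained with a plain delta rule (an illustrative stand-in for the paper's learning process) to map each frame of a sequence to its successor. The frame matrix X, the learning rate eta, and the iteration count are assumptions chosen for illustration; with n linearly independent frames the condition W x_k = x_{(k+1) mod n} has an exact solution, which is why a capacity of up to n frames is attainable.

```python
# Hedged sketch: an n-dimensional linear recurrent map storing a cycle of
# n frames. Not the paper's suppression/max architecture; a plain delta
# rule stands in for the training process described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 5                                  # network dimension
X = rng.standard_normal((n, n))        # n frames as columns (generically independent)
X_next = np.roll(X, -1, axis=1)        # target: each frame maps to its successor

W = np.zeros((n, n))                   # synaptic-like coupling matrix
eta = 0.05                             # below the LMS stability bound ~2/||x||^2
for _ in range(5000):
    k = rng.integers(n)                # pick a random frame of the sequence
    x, y = X[:, k], X_next[:, k]
    W += eta * np.outer(y - W @ x, x)  # shrink the prediction error for frame k

# Replay: iterating the learned coupling from frame 0 returns to it after n steps.
x = X[:, 0]
for _ in range(n):
    x = W @ x
print("cycle replay error:", np.linalg.norm(x - X[:, 0]))
```

The learning-rate comment reflects the standard stability limit for delta-rule (LMS) updates; the abstract's stated limits apply to the paper's own architecture and may differ.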