Abstract

Research on neural population coding has revealed that continuous stimuli, such as orientation, direction of motion, and the spatial location of objects, can be encoded as continuous attractors in neural networks. The dynamical behavior of continuous attractors is an interesting property of recurrent neural networks. This paper proposes a class of recurrent neural networks without lateral inhibition. Since there is no general rule for determining the stability of such networks without specifying the excitatory connections, stability conditions are derived analytically for particular cases. The paper shows that the networks can possess continuous attractors when the excitatory connections are Gaussian-shaped. Simulation examples are provided for illustration.
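As a rough illustration of the kind of network the abstract describes, the toy sketch below simulates a ring of linear-threshold neurons coupled by purely excitatory, Gaussian-shaped weights. This is not the paper's exact model or its stability conditions; all parameters (network size, kernel width, time step) are illustrative assumptions. It highlights the translation symmetry of Gaussian connectivity: an activity bump relaxes without its peak drifting, which is the symmetry underlying a continuous attractor, where every peak position is an equally valid steady state.

```python
import numpy as np

# Toy sketch (not the paper's exact model): a ring of N linear-threshold
# neurons with purely excitatory, Gaussian-shaped recurrent weights.
# All parameter values here are illustrative assumptions.
N = 64
theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)

def ring_dist(a, b):
    """Shortest angular distance on the ring."""
    d = np.abs(a - b)
    return np.minimum(d, 2 * np.pi - d)

sigma = 0.5  # width of the Gaussian excitatory kernel (assumed)
W = np.exp(-ring_dist(theta[:, None], theta[None, :]) ** 2 / (2 * sigma ** 2))
W /= W.sum(axis=1, keepdims=True)  # normalize rows so activity stays bounded

def bump(center):
    """Gaussian activity bump centered at `center`."""
    return np.exp(-ring_dist(theta, center) ** 2 / (2 * sigma ** 2))

def simulate(x0, steps=200, dt=0.1):
    """Euler-integrate the rate dynamics dx/dt = -x + W @ max(x, 0)."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + W @ np.maximum(x, 0.0))
    return x

# Because the Gaussian weights are translation-invariant, a bump started
# at any position relaxes in place: its peak does not drift.
for c in (np.pi / 2, np.pi):
    x = simulate(bump(c))
    print(f"initial center {c:.2f} -> final peak at {theta[np.argmax(x)]:.2f}")
```

Running the loop with bumps at different centers shows each one settling with its peak where it started, consistent with a continuum of steady states rather than a single isolated attractor.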
