Abstract

We previously proposed using complex numbers in neural networks to take into account the phase differences among impulse trains. In the present paper, we propose a complex associative memory in which the input and output patterns are composed of the values 1 and −1, while the weights, and hence the membrane potentials, take complex values. The learning rule for the complex weights is a direct extension of that of the complex perceptron we previously proposed. We also propose "sigmoidal learning" for the real-valued associative memory, which replaces the signum function in the orthogonal learning rule with the sigmoid function and yields weight matrices different from those obtained by the generalized inverse method. The complex associative memory performed better than would be predicted from the increase in degrees of freedom due to complexification. It also exhibited interesting characteristics in its dynamics and in the distribution of its weights, both quite distinct from those of its real-valued counterpart.
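To make the setting concrete, the following is a minimal sketch of an associative memory with ±1 patterns and complex weights, as described above. The paper's actual learning rule (an extension of the authors' complex perceptron rule) is not given in this abstract, so a simple Hebbian correlation rule with a small imaginary perturbation is used here as a stand-in assumption; the recall step thresholds the real part of the complex membrane potential back to ±1.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 5  # neurons, stored patterns (illustrative sizes)

# Stored patterns: entries are +1 / -1, as in the paper.
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Stand-in learning rule (assumption, NOT the paper's rule): a Hebbian
# correlation matrix plus a small random imaginary part, so that the
# weights -- and hence the membrane potentials -- are complex-valued.
W = (patterns.T @ patterns).astype(complex) / N
W += 1j * 0.01 * rng.standard_normal((N, N))
np.fill_diagonal(W, 0.0)

def recall(x, steps=10):
    """Synchronous recall: compute the complex membrane potential W @ x
    and threshold its real part back to a +1/-1 pattern each step."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x = np.where((W @ x).real >= 0, 1.0, -1.0)
    return x

# Usage: present a noisy probe (10 flipped components) of a stored pattern.
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
recovered = recall(probe)
```

The choice of thresholding the real part is one plausible way to map a complex potential back to a ±1 output; the paper may use a different decision function.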


