Abstract

Google used 10 million natural images as input and performed self-organized learning with a huge neural network containing 10 billion synapses; neurons with receptive fields resembling a cat image appeared in the upper layer. Hokusai drew the "Great Wave" using his memory, which has a fractal structure. Which do you think is "beautiful": Google's "cat picture" or Hokusai's "Great Wave"? I think Hokusai's is beautiful, because it is based on stunning information compression. The network proposed in this paper is a one-layer artificial neural network with feedforward and recurrent (feedback) connections. In the feedforward connections, the spatiotemporal learning rule (STLR; Tsukada et al., 1994, 1996) has a high ability for pattern separation, while in the recurrent connections the Hebbian learning rule (HEB) excels at pattern completion. The interaction between the two rules plays an important role in self-organizing context-dependent attractors in the memory network. The context-dependent attractors depend on the balance between STLR and HEB. This structure is an important factor that allows memory networks to hierarchically embed sequences of events.
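To make the described interaction concrete, the following is a minimal sketch in Python (NumPy) of a one-layer memory network whose feedforward weights are trained with a simplified STLR-like rule and whose recurrent weights are trained with a Hebbian rule. Everything in it is an illustrative assumption rather than the authors' formulation: the simplified STLR update (driven by the coincidence of the current input with a decaying trace of past inputs, independent of postsynaptic firing), the outer-product Hebbian rule, the hypothetical BALANCE parameter standing in for the STLR/HEB balance the abstract mentions, and all sizes and learning rates. Shifting BALANCE toward 1 makes the network separation-dominated; shifting it toward 0 makes it completion-dominated, mirroring the abstract's claim that the attractors depend on that balance.

```python
import numpy as np

# Minimal sketch (assumed, not the authors' method): one memory layer with
# feedforward weights trained by a simplified STLR-like rule and recurrent
# weights trained by a Hebbian rule.

rng = np.random.default_rng(0)

N_IN, N = 8, 16                  # input units / memory units (illustrative)
ETA_STLR, ETA_HEB = 0.05, 0.05   # learning rates (assumed)
BALANCE = 0.5                    # hypothetical STLR-vs-HEB balance parameter
TAU = 0.7                        # decay of the temporal trace of past inputs

W_ff = rng.normal(0.0, 0.1, size=(N, N_IN))  # feedforward weights
W_rec = np.zeros((N, N))                     # recurrent weights

y = np.zeros(N)                  # memory-layer state
trace = np.zeros(N_IN)           # decaying trace of recent inputs (context)

# Present a short input sequence (one "context" of spatial patterns).
sequence = [rng.binomial(1, 0.3, N_IN).astype(float) for _ in range(5)]

for x in sequence:
    y_prev = y
    y = np.tanh(W_ff @ x + W_rec @ y_prev)

    # STLR-like update (simplified): the feedforward change is driven by the
    # coincidence of the current input with the trace of past inputs, i.e.
    # by the input's spatiotemporal context, not by postsynaptic firing.
    coincidence = np.outer(np.ones(N), x * trace)
    W_ff += BALANCE * ETA_STLR * (coincidence - W_ff)

    # Hebbian update on the recurrent connections: co-active memory units
    # strengthen their mutual links, which supports pattern completion.
    W_rec += (1.0 - BALANCE) * ETA_HEB * (np.outer(y, y_prev) - W_rec)
    np.fill_diagonal(W_rec, 0.0)             # no self-connections

    trace = TAU * trace + x                  # update the temporal context

print("memory-layer state after the sequence:", np.round(y, 2))
```

Under these assumptions, the same spatial pattern presented after different input histories produces different feedforward updates (because the trace differs), which is one simple way a network can form context-dependent attractors.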
