Abstract

We investigate the pattern completion performance of neural auto-associative memories composed of binary threshold neurons for sparsely coded binary memory patterns. By focussing on iterative retrieval, we are able to introduce effective threshold control strategies. These are investigated by means of computer simulation experiments and analytical treatment. To evaluate the system's performance we consider the completion capacity C and the mean retrieval errors. The asymptotic completion capacity for the recall of sparsely coded binary patterns with one-step retrieval is known to be ln 2 / 4 ≈ 17.32% for binary Hebbian learning and 1/(8 ln 2) ≈ 18% for additive Hebbian learning. These values are achieved with vanishing error probability and are higher than those obtained in other known neural memory models. Recent investigations of binary Hebbian learning have proved that iterative retrieval, as a more refined retrieval method, does not improve the asymptotic completion capacity of one-step retrieval. In an auto-associative memory of finite size, however, we show that iterative retrieval achieves higher capacity and better error correction than one-step retrieval. One-step retrieval produces high retrieval errors at optimal memory load; iterative retrieval reduces these errors within a few iteration steps (t ⩽ 5). Experiments comparing additive and binary Hebbian learning show that in the finite model binary Hebbian learning exhibits much better performance, so the main concern of this paper is binary Hebbian learning. We examine iterative retrieval in experiments with up to n = 20,000 threshold neurons. At this system size, one-step retrieval yields a completion capacity of about 16%, the second retrieval step increases this value to 17.9%, and iterative retrieval reaches 19%. The first two retrieval steps in the finite system have also been treated analytically. For one-step retrieval, the asymptotic capacity value is approached from below as the system size grows. For the second retrieval step (and, as the experiments suggest, also for iterative retrieval) the finite-size behaviour is different: the capacity exceeds the asymptotic value, reaches an optimum at finite system size, and then decreases to the asymptotic limit.
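
To make the setting concrete, the following is a minimal sketch of binary (clipped) Hebbian storage and iterative retrieval with a simple threshold control strategy. It is an illustration under stated assumptions, not the simulation code used in the paper: the pattern sizes, the max-sum threshold rule, and the function names are choices made for this example only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, M = 1000, 10, 2000  # neurons, active units per pattern, stored patterns

# Sparsely coded binary patterns: exactly k ones out of n units.
patterns = np.zeros((M, n), dtype=np.uint8)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Binary Hebbian learning: the weight W[i, j] is clipped to 1 as soon as
# units i and j are active together in at least one stored pattern.
W = np.zeros((n, n), dtype=np.uint8)
for x in patterns:
    active = np.flatnonzero(x)
    W[np.ix_(active, active)] = 1

def iterative_retrieval(W, cue, max_steps=5):
    """Iterate the threshold-neuron update until a fixed point or max_steps.
    Threshold control here (an assumption of this sketch): fire exactly
    those units whose dendritic sum reaches the current maximum sum."""
    x = cue.astype(np.uint8)
    for _ in range(max_steps):
        s = W @ x.astype(np.int64)            # dendritic sums
        x_new = (s >= s.max()).astype(np.uint8)
        if np.array_equal(x_new, x):          # fixed point reached
            break
        x = x_new
    return x

# Pattern completion: present only half of the ones of a stored pattern.
target = patterns[0]
cue = target.copy()
cue[np.flatnonzero(target)[k // 2:]] = 0      # delete half the active units
result = iterative_retrieval(W, cue)
print("remaining errors:", np.count_nonzero(result != target))
```

Calling iterative_retrieval with max_steps=1 reproduces one-step retrieval for comparison; with the iterative version the loop typically reaches a fixed point within a few steps, in line with the t ⩽ 5 behaviour described above.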
