Abstract

A neural network that retrieves stored binary vectors, when probed by possibly corrupted versions of them, is presented. It employs sparse ternary internal coding and autocorrelation (Hebbian) storage. It is symmetrically structured and, consequently, can be folded into a feedback configuration. Bounds on the network parameters are derived from probabilistic considerations. It is shown that when the input dimension is n, the proportional activation radius is ρ and the network size is 2^(νn) with ν > 1 − h₂(ρ), the equilibrium capacity is at least 2^(αn)/(8nρ(1−ρ)) for any α < 1 − h₂(ρ), where h₂(·) is the binary entropy. A similar capacity bound is derived for the correction of errors of proportional size ρ or less, when ρ ≤ 0.3. The performance of a finite-size symmetric network is examined by simulation and found to exceed, at the cost of higher connectivity, that of the Kanerva (1988) model, operating as a content addressable memory.
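For concreteness, the sketch below evaluates the stated equilibrium-capacity bound under the abstract's conditions. The function names and the example values of n, ρ, α, and ν are illustrative choices, not taken from the paper; the only assumption is the formula 2^(αn)/(8nρ(1−ρ)) with α < 1 − h₂(ρ) as quoted above.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h2(p) in bits, with the convention h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def equilibrium_capacity_bound(n: int, rho: float, alpha: float) -> float:
    """Lower bound 2^(alpha*n) / (8*n*rho*(1-rho)) on the number of storable
    vectors, valid for any alpha < 1 - h2(rho)."""
    assert alpha < 1.0 - binary_entropy(rho), "bound requires alpha < 1 - h2(rho)"
    return 2.0 ** (alpha * n) / (8.0 * n * rho * (1.0 - rho))

# Illustrative parameters (hypothetical, chosen only to exercise the formula).
n, rho = 100, 0.1
alpha = 0.9 * (1.0 - binary_entropy(rho))  # any alpha below 1 - h2(rho)
nu = 1.1 * (1.0 - binary_entropy(rho))     # network size 2^(nu*n) requires nu > 1 - h2(rho)

print(f"h2(rho)            = {binary_entropy(rho):.4f}")
print(f"network size       = 2^{nu * n:.1f} units")
print(f"capacity bound     >= {equilibrium_capacity_bound(n, rho, alpha):.3e} vectors")
```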
