Abstract
Based on the self-consistent signal-to-noise analysis (SCSNA), which is capable of dealing with analog neural networks with a wide class of transfer functions, the enhancement of the storage capacity of associative memory and the related statistical properties of neural networks are studied for random memory patterns. Two types of transfer functions with a threshold parameter $\theta$ are considered; both are derived from the sigmoidal one to represent the output of three-state neurons. Neural networks with the monotonically increasing transfer function $F^{\mathrm{M}}$, defined by $F^{\mathrm{M}}(u)=\mathrm{sgn}\,u$ for $|u|>\theta$ and $F^{\mathrm{M}}(u)=0$ for $|u|\le\theta$, are shown to make it impossible for the spin-glass state to coexist with retrieval states in a certain region of the parameters $\theta$ and $\alpha$ (the loading rate of memory patterns), implying a reduction of the number of spurious states. The behavior of the storage capacity with changing $\theta$ is qualitatively the same as that of Ising-spin neural networks with varying temperature. On the other hand, the nonmonotonic transfer function $F^{\mathrm{NM}}$, defined by $F^{\mathrm{NM}}(u)=\mathrm{sgn}\,u$ for $|u|<\theta$ and $F^{\mathrm{NM}}(u)=0$ for $|u|\ge\theta$, gives rise to remarkable features in several respects. First, it yields a large enhancement of the storage capacity compared with the Amit-Gutfreund-Sompolinsky (AGS) value: as $\theta$ is decreased from $\theta=\infty$, the storage capacity $\alpha_c$ of such a network increases from the AGS value ($\simeq 0.14$), attains its maximum value of $\simeq 0.42$ at $\theta \simeq 0.7$, and then decreases to vanish at $\theta=0$. Whereas for $\theta \gtrsim 1$ the storage capacity $\alpha_c$ coincides with the value $\tilde{\alpha}_c$ determined by the SCSNA as the upper bound of $\alpha$ ensuring the existence of retrieval solutions, for $\theta \lesssim 1$ the $\alpha_c$ is shown to differ from $\tilde{\alpha}_c$, with the result that the retrieval solutions claimed by the SCSNA are unstable for $\alpha_c < \alpha < \tilde{\alpha}_c$. Second, in the case of $\theta < 1$ the network can exhibit a new type of phase, which appears as a result of a phase transition with respect to the non-Gaussian distribution of the local fields of the neurons: the standard type of retrieval state with $r \neq 0$ (i.e., finite width of the local-field distribution), which is implied by the order-parameter equations of the SCSNA, disappears at a certain critical loading rate $\alpha_0$, and for $\alpha \le \alpha_0$ a qualitatively different type of retrieval state comes into existence, in which the width of the local-field distribution vanishes (i.e., $r = 0^{+}$).
As a consequence, memory retrieval without errors becomes possible even in the saturation limit $\alpha \neq 0$. Results of computer simulations on the statistical properties of the novel phase with $\alpha \le \alpha_0$ are shown to be in satisfactory agreement with the theoretical results. The effect of introducing self-couplings on the storage capacity is also analyzed for the two types of networks. It is conspicuous for networks with $F^{\mathrm{NM}}$, where the self-couplings increase the stability of the retrieval solutions of the SCSNA for small values of $\theta$, leading to a remarkable enhancement of the storage capacity.
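For a concrete picture of the two transfer functions, the following Python sketch implements $F^{\mathrm{M}}$ and $F^{\mathrm{NM}}$ as defined above and runs a toy Hebbian retrieval experiment with $F^{\mathrm{NM}}$. It is a minimal illustration only: the network size N, loading rate alpha, threshold theta = 0.7, the synchronous update rule, and the absence of self-couplings are choices made here for the example; the paper's results follow from the analytical SCSNA order-parameter equations (supplemented by simulations), not from this code, and the measured overlap at finite N will depend on these choices.

import numpy as np

def F_M(u, theta):
    # Monotonic three-state transfer function:
    # F_M(u) = sgn(u) for |u| > theta, 0 for |u| <= theta
    return np.where(np.abs(u) > theta, np.sign(u), 0.0)

def F_NM(u, theta):
    # Nonmonotonic transfer function:
    # F_NM(u) = sgn(u) for |u| < theta, 0 for |u| >= theta
    return np.where(np.abs(u) < theta, np.sign(u), 0.0)

# Toy retrieval experiment (illustrative parameters, not taken from the paper)
rng = np.random.default_rng(0)
N, alpha, theta = 1000, 0.30, 0.7            # network size, loading rate, threshold
p = int(alpha * N)                           # number of stored random patterns

xi = rng.choice([-1.0, 1.0], size=(p, N))    # random +/-1 memory patterns
J = xi.T @ xi / N                            # Hebbian couplings
np.fill_diagonal(J, 0.0)                     # no self-couplings in this sketch

s = xi[0].copy()                             # initialize at the first stored pattern
for _ in range(50):                          # synchronous updates (a modeling choice)
    s = F_NM(J @ s, theta)                   # local fields h_i = sum_j J_ij s_j

overlap = (xi[0] @ s) / N                    # overlap m with the target pattern
print(f"alpha = {alpha:.2f}, theta = {theta}, overlap m = {overlap:.3f}")

Replacing F_NM by F_M in the update loop gives the monotonic three-state network discussed in the first part of the abstract; sweeping alpha and theta in such a simulation is one way to probe numerically the capacity enhancement described above.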