Abstract

Recent studies of the information capacity of sparsely coded memory networks have led to contradictory results. In the Willshaw model, where the couplings are binary (0 or 1), the maximal quantity of information that can be stored is ln 2 ≈ 0.69 bits per synapse. On the other hand, a calculation à la Gardner (1988) for (0,1) couplings gives an upper bound on the maximal capacity of about 0.29 bits per synapse. In this study, the author considers two possible sources of this discrepancy. The first is that the criteria defining the maximal capacity differ (with or without a constraint of perfect, errorless storage). The second is a difference in the choice of the probability distribution of the random patterns used to compute this capacity. The analysis shows in particular that, for the Willshaw model, the maximal information capacity is much larger when the number of active neurons is exactly the same in every stored pattern than when it is fixed only on average. In addition, the author gives an argument showing that this result may be generic, i.e., valid for any activity level and independent of the learning rule.
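To make the setting concrete, here is a minimal sketch of the Willshaw clipped-Hebbian rule with binary (0,1) couplings. It is an illustration, not code from the paper: the sizes N, k, P and the helper names pattern_fixed and pattern_bernoulli are assumptions chosen for the demo. The two generators correspond to the two pattern ensembles the abstract contrasts: exactly k active neurons per pattern versus k active neurons only on average.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 256   # units per layer (illustrative size, not from the paper)
k = 8     # active units per pattern
P = 400   # number of stored pattern pairs

def pattern_fixed(n, k, rng):
    """Pattern with *exactly* k active units (the fixed-activity ensemble)."""
    v = np.zeros(n, dtype=np.uint8)
    v[rng.choice(n, size=k, replace=False)] = 1
    return v

def pattern_bernoulli(n, k, rng):
    """Pattern with k active units *on average* (each unit on with prob. k/n)."""
    return (rng.random(n) < k / n).astype(np.uint8)

# Clipped-Hebbian (Willshaw) storage: the binary coupling W_ij is set to 1
# as soon as input unit j and output unit i are simultaneously active
# in any stored pair; further coincidences leave it at 1.
X = np.array([pattern_fixed(N, k, rng) for _ in range(P)])
Y = np.array([pattern_fixed(N, k, rng) for _ in range(P)])
W = (Y.T @ X > 0).astype(np.uint8)

def recall(W, x):
    """An output unit fires iff every active input line reaches it through W."""
    return (W @ x >= x.sum()).astype(np.uint8)

# With binary couplings the stored 1s are always recovered; errors can only
# be spurious extra 1s, which proliferate once W becomes too dense.
spurious = sum(int((recall(W, x) != y).sum()) for x, y in zip(X, Y))
print(f"total spurious bits over {P} recalls: {spurious}")
print(f"fraction of couplings set: {W.mean():.2f}")
```

Swapping pattern_fixed for pattern_bernoulli reproduces the average-activity ensemble that, per the abstract, yields a markedly lower capacity. The classical ln 2 ≈ 0.69 bits-per-synapse figure is attained in the sparse limit, where the optimally loaded coupling matrix has roughly half of its entries set.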
