Abstract

The information capacity of two classes of neural network models using the outer-product (Hebbian) storage rule is investigated. A generalization of Hopfield-type models with higher-order interactions is analyzed, along with a similar generalization of a three-layer network that uses Hebbian learning between the second and third layers. It is shown that the total information stored in these systems is a constant times the number of connections in the network, independent of the particular model, the order of the model, or whether the weights are clipped.
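
As a concrete illustration of the storage rule the abstract refers to, the sketch below implements the standard second-order outer-product (Hebbian) rule for a Hopfield-type network, including the clipped-weight variant; the network size, pattern count, and noise level are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative parameters, not values from the paper.
N = 100   # neurons
p = 10    # stored patterns (load p/N = 0.1, below the retrieval limit)

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(p, N))  # random bipolar patterns

# Outer-product (Hebbian) rule for a second-order Hopfield network:
# W_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, with zero self-coupling.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Clipped-weight variant: only the sign of each weight is kept.
W_clipped = np.sign(W)

def recall(weights, probe, steps=10):
    """Synchronous recall dynamics: s <- sign(W s)."""
    state = probe.copy()
    for _ in range(steps):
        state = np.sign(weights @ state)
        state[state == 0] = 1  # break ties deterministically
    return state

# Corrupt 10% of the first pattern's bits and try to retrieve it.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1

for name, weights in [("full", W), ("clipped", W_clipped)]:
    overlap = recall(weights, probe) @ patterns[0] / N  # ~1.0 means success
    print(f"{name:>7} weights: overlap = {overlap:.3f}")
```

The order-k generalization mentioned in the abstract typically replaces the pairwise sum with a product over k pattern components, W_{i1...ik} proportional to sum_mu xi_{i1}^mu ... xi_{ik}^mu, so the weight tensor gains one index per additional order of interaction.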
