Abstract
We investigate the storage capacity of a fully connected layered neural network with Q (⩾ 2)-state clock neurons, including Q = ∞ (corresponding to oscillatory neurons), and with intra-layer connections, where random Q-valued patterns are embedded in the network by the Hebbian learning rule. We assume that the inter-layer and intra-layer neurons are updated simultaneously. By adjusting a single parameter, we can treat a recurrent network, a feed-forward network, and a layered network with intra-layer connections within the same framework. We analyze an energy function of the network using the replica method. We show that, at zero temperature and for Q = 2, 3, 4 and 5, the storage capacity of the layered network with intra-layer connections is enhanced in comparison with that of the recurrent network and that of the feed-forward network; this is due to the competition between the inter-layer and intra-layer connections. For Q ⩾ 6, including Q = ∞, the zero-temperature storage capacity is largest for the feed-forward network. We also obtain phase diagrams in the load-parameter versus temperature plane for the layered network with intra-layer connections.
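As a rough illustration of the setup described above, the following sketch builds Hebbian couplings for Q-state clock neurons, whose states are phases on the circle, and evaluates a clock-model energy function. The function names, the 1/N normalization, and the specific energy form are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def hebbian_couplings(patterns: np.ndarray) -> np.ndarray:
    """Hebbian rule for clock neurons (assumed form):
    J_ij = (1/N) * sum_mu cos(xi_i^mu - xi_j^mu), with zero self-coupling.
    patterns: (P, N) array of phases in [0, 2*pi)."""
    P, N = patterns.shape
    diff = patterns[:, :, None] - patterns[:, None, :]  # (P, N, N) phase differences
    J = np.cos(diff).sum(axis=0) / N
    np.fill_diagonal(J, 0.0)
    return J

def energy(J: np.ndarray, theta: np.ndarray) -> float:
    """Clock-model energy E = -(1/2) * sum_ij J_ij cos(theta_i - theta_j)."""
    return -0.5 * float(np.sum(J * np.cos(theta[:, None] - theta[None, :])))

# Embed P random Q-valued patterns in a network of N neurons.
rng = np.random.default_rng(0)
Q, N, P = 4, 50, 3
patterns = 2 * np.pi * rng.integers(0, Q, size=(P, N)) / Q
J = hebbian_couplings(patterns)

# A stored pattern should sit at lower energy than a random configuration.
e_stored = energy(J, patterns[0])
e_random = energy(J, 2 * np.pi * rng.integers(0, Q, size=N) / Q)
```

This covers only a single recurrent layer; the paper's layered architecture with separate inter-layer and intra-layer couplings, and the replica analysis of the storage capacity, are beyond this sketch.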
Journal: Physica A: Statistical Mechanics and its Applications