Abstract

Gardner's (1989) analysis of the optimal storage capacity of neural networks is extended to include finite-temperature effects. The typical volume of the space of interactions is calculated for strongly diluted networks as a function of the storage ratio α, the temperature T, and the tolerance parameter m, from which the optimal storage capacity α_c is obtained as a function of T and m. At zero temperature it is found that α_c = 2 regardless of m, while at finite temperatures α_c in general increases with the tolerance. The authors show how the best performance for given α and T is obtained, revealing a first-order transition from high-quality to low-quality performance at low temperatures. An approximate criterion for recalling, valid near m = 1, is also discussed.
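The zero-temperature result α_c = 2 can be checked numerically against Gardner's original capacity formula for unbiased patterns, 1/α_c(κ) = ∫_{-κ}^{∞} Dt (t+κ)², where Dt is the Gaussian measure and κ the stability parameter; this formula is taken from Gardner's earlier work, not from the present abstract, and the sketch below simply evaluates it at κ = 0:

```python
import math

def gaussian(t):
    """Standard Gaussian density, the measure Dt in Gardner's integral."""
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

def alpha_c(kappa, upper=10.0, n=200000):
    """Gardner capacity alpha_c(kappa) = 1 / integral_{-kappa}^{inf} (t+kappa)^2 Dt.

    The upper limit is truncated at `upper`, which is harmless since the
    Gaussian tail beyond t = 10 is negligible; midpoint rule with n panels.
    """
    lo = -kappa
    h = (upper - lo) / n
    s = 0.0
    for i in range(n):
        t = lo + (i + 0.5) * h
        s += (t + kappa) ** 2 * gaussian(t) * h
    return 1.0 / s

print(round(alpha_c(0.0), 3))  # kappa = 0 recovers the capacity alpha_c = 2
```

At κ = 0 the integral reduces to ∫_0^∞ t² Dt = 1/2, so the numerical value converges to α_c = 2, consistent with the zero-temperature limit stated above.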
