Abstract

Neural networks are large systems of interconnected neurons, or simple processors, each performing a relatively simple function. The information in a neural network is stored in the interneural contacts and conveyed by patterns of neural states. It has been shown that, under certain computation and storage rules, a neural network will perform as an associative memory: it will converge to the stored neural pattern closest to its initial state. However, for fully connected networks, where each neuron is connected to all the others, the storage conditions that guarantee such convergence are very restrictive and the storage capacity of the network is very low. It will be shown that neural networks with fractal connectivity patterns perform considerably better as associative memories and offer a vastly greater storage capacity than fully connected networks.
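To illustrate the associative-memory behavior the abstract describes, here is a minimal sketch of a classic fully connected network with a Hebbian (outer-product) storage rule, the standard Hopfield-style baseline. This is an assumed illustration of the convergence property only; the abstract's fractal-connectivity scheme and its specific storage rules are not implemented here.

```python
import numpy as np

def store(patterns):
    """Hebbian outer-product storage rule; patterns are rows of +/-1 values."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, state, max_steps=20):
    """Iterate neural updates until a fixed point (or a step limit)."""
    s = state.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, s):
            break
        s = new
    return s

# Two orthogonal example patterns (hypothetical data for illustration).
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = store(patterns)

# Start from a corrupted version of the first pattern (one bit flipped);
# the network converges back to the closest stored pattern.
noisy = patterns[0].copy()
noisy[0] = -noisy[0]
recovered = recall(W, noisy)
```

With only two stored patterns on eight neurons, recall succeeds; the abstract's point is that for fully connected networks this capacity saturates quickly as more patterns are stored.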
