Abstract
A trainable analog restricted Hopfield Network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by weighted directional paths that form a bipartite graph with no intralayer connections. An energy or Lyapunov function is derived to show that the proposed network converges to stable states. The proposed network can be trained using either a modified SPSA or BPTT algorithm, ensuring that all weights remain symmetric. Simulation results show that the presence of hidden nodes increases the network’s memory capacity. Using EXOR as an example, the network can be trained to act as a dynamic classifier. Using A, U, T, S as training characters, the network was trained to be an associative memory. Simulation results show that the network can perfectly re-create stored images from noisy inputs. Its re-creation performance has higher noise tolerance than the standard Hopfield Network and the Restricted Boltzmann Machine. Simulation results also illustrate the importance of feedback iteration in implementing an associative memory that re-creates stored patterns from noisy images.
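To make the recall procedure described above concrete, the following is a minimal sketch of feedback-iteration recall in a bipartite (restricted) Hopfield-style network with analog activations and symmetric visible-hidden weights. The sizes, gain, and weight values are illustrative placeholders, not the paper's trained parameters, and `tanh` is assumed as the analog activation.

```python
import numpy as np

def energy(v, h, W, bv, bh):
    """Lyapunov-style energy of the bipartite network (decreases during recall)."""
    return -(v @ W @ h + bv @ v + bh @ h)

def recall(v0, W, bv, bh, gain=2.0, iters=50, tol=1e-6):
    """Iteratively re-create a stored pattern from a noisy visible input v0."""
    v = v0.copy()
    for _ in range(iters):
        h = np.tanh(gain * (W.T @ v + bh))    # visible -> hidden (forward path)
        v_new = np.tanh(gain * (W @ h + bv))  # hidden -> visible (feedback path)
        if np.max(np.abs(v_new - v)) < tol:   # stop once the state is stable
            v = v_new
            break
        v = v_new
    return v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_visible, n_hidden = 16, 8
    W = 0.1 * rng.standard_normal((n_visible, n_hidden))  # placeholder weights
    bv, bh = np.zeros(n_visible), np.zeros(n_hidden)
    noisy = np.sign(rng.standard_normal(n_visible))       # stand-in for a noisy image
    restored = restored = recall(noisy, W, bv, bh)
    h = np.tanh(2.0 * (W.T @ restored + bh))
    print("final energy:", energy(restored, h, W, bv, bh))
```

In an actual trained network, `W`, `bv`, and `bh` would come from the modified SPSA or BPTT training described in the paper; the sketch only illustrates how alternating visible-hidden updates drive the state toward a low-energy stored pattern.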