Abstract

Statistical parameters that can be used to estimate the convergence probability of arbitrary-order Hebbian-type neural network associative memories (HAMs) with N neurons and M stored patterns are developed. The principle involves using two figures of merit, ε/e and eN, to determine the convergence probability for indirect (iterative) convergence and direct (one-step) convergence HAMs. Given e, the probability that a neuron changes to an incorrect bit after one update, the parameter ε/e determines the capability of converging iteratively to at most εN bits away from the stored vector after a stable state is reached, where 0 < ε < 0.5. It is shown that the indirect convergence probability P_ic ≈ 1.0 for all HAMs having ε/e ≥ 20. If precise convergence to the stored vector is required in one step, the parameter eN is used to determine the probability of direct convergence, P_dc.
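To make the two figures of merit concrete, the following is a minimal simulation sketch (not taken from the paper) that estimates e empirically for a first-order Hebbian memory and then forms eN and ε/e. The network size N, pattern count M, tolerance ε, and the bipolar (+1/−1) pattern encoding are illustrative assumptions, not values from the paper.

    # Sketch: estimate e (probability a neuron flips to an incorrect bit after
    # one update) for a first-order Hebbian associative memory, then form the
    # figures of merit e*N and epsilon/e. All parameter values are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    N, M = 256, 16                      # neurons and stored bipolar patterns
    patterns = rng.choice([-1, 1], size=(M, N))

    # Hebbian (outer-product) weight matrix with zero self-connections
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)

    # Count bit flips after one synchronous update starting from each stored pattern
    flips = 0
    for p in patterns:
        updated = np.sign(W @ p)
        updated[updated == 0] = 1       # break ties toward +1
        flips += np.sum(updated != p)
    e = flips / (M * N)                 # empirical per-neuron error probability

    epsilon = 0.05                      # tolerated fraction of wrong bits
    print(f"e = {e:.4f}, e*N = {e * N:.2f}, epsilon/e = {epsilon / max(e, 1e-12):.1f}")

Under the abstract's criterion, a configuration whose estimated ε/e reaches roughly 20 or more would be expected to converge iteratively to within εN bits of the stored vector with probability near 1.0, while eN would be consulted when exact one-step convergence is required.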
