Abstract

We investigate the asymptotic behaviour of the synaptic matrix iterated according to the unlearning algorithm (A. Yu et al., 1994). The algorithm has previously been proven to converge to the projector (pseudo-inverse) rule matrix when the unlearning strength parameter ε > 0 does not exceed some critical value. We consider the asymptotic behaviour of the normalized synaptic matrix J̃, relating it to the dynamics of the corresponding spectrum. It is found that the algorithm converges for an arbitrary value of ε, and that there are only three possibilities for the limiting behaviour of J̃. The first is successful unlearning, in which J̃ converges to the projection matrix onto the linear subspace ℒ spanned by a maximal subset of linearly independent patterns. At sufficiently large values of ε the typical outcome of the iterations is failed unlearning, with J̃ converging to minus the projector onto a random unit vector ξ ∈ ℒ. We show that failed unlearning results in total memory breakdown. There is also an intermediate case in which J̃ converges to the projection matrix onto some subspace of ℒ. The probability of each asymptotic outcome as a function of the unlearning strength is studied for the case of unbiased random patterns. The retrieval properties of a system equipped with the limiting synaptic matrix are also discussed.
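As a rough numerical illustration (not the authors' construction), the sketch below implements Hopfield-style unlearning in Python/NumPy: spurious attractors reached from random initial states are repeatedly subtracted from a Hebbian matrix, and the normalized result is compared with the projector onto the span of the stored patterns, i.e. the "successful unlearning" limit described above. The update rule, network size, pattern count, and unlearning strength eps are assumptions chosen for illustration; the specific 1994 iteration analysed in the paper may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

N, p = 100, 10                          # neurons, stored patterns (illustrative sizes)
xi = rng.choice([-1, 1], size=(N, p)).astype(float)   # unbiased random patterns as columns

# Hebbian starting matrix
J = xi @ xi.T / N
np.fill_diagonal(J, 0.0)

# Projector (pseudo-inverse rule) matrix onto span(xi): the "successful
# unlearning" limit described in the abstract.
P = xi @ np.linalg.pinv(xi)

def spurious_attractor(J, sweeps=10):
    """Zero-temperature asynchronous dynamics from a random initial state."""
    s = rng.choice([-1.0, 1.0], size=N)
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1.0 if J[i] @ s >= 0 else -1.0
    return s

eps = 0.01                              # unlearning strength (assumed small)
for _ in range(300):
    s = spurious_attractor(J)
    J -= (eps / N) * np.outer(s, s)     # one unlearning step: subtract the attractor

# Distance between the normalized matrix and the normalized projector;
# for small eps it should shrink as unlearning proceeds.
J_tilde = J / np.linalg.norm(J)
print(np.linalg.norm(J_tilde - P / np.linalg.norm(P)))
```

For large eps, the same loop can instead be used to observe the failure mode mentioned in the abstract, where the normalized matrix drifts toward minus a rank-one projector and retrieval is lost.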
