Abstract

In this paper, we characterize a self-developing neural network model, the Growing Cell Structures (GCS). In GCS, each node (or cell) is associated with a local resource counter τ(t). We show that GCS has a conservation property by which the sum of all resource counters always equals s(1 − α)/α, where s is the increment added to τ(t) of the winning node after each input presentation and α (0 < α < 1.0) is the forgetting (i.e., decay) factor applied to τ(t) of the non-winning nodes. The conservation property provides insight into how GCS maximizes information entropy. The property is also employed to unveil the chain-reaction effect and the race condition, both of which can greatly influence the performance of GCS. We show that GCS performs better in terms of the equiprobable criterion when the resource counters are decayed with a smaller α.
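The abstract does not spell out the exact order of the increment and decay steps. A minimal sketch, assuming the winner's counter receives the increment s and all counters are then decayed by the factor (1 − α), reproduces the stated invariant: the total then evolves as T ← (1 − α)(T + s), which converges to s(1 − α)/α regardless of which cell wins each round. The function name and the random winner selection below are illustrative, not from the paper.

```python
import random

def simulate_gcs_counters(n_cells=10, s=1.0, alpha=0.05, steps=2000, seed=0):
    """Simulate the resource-counter dynamics sketched above.

    Each step: an arbitrary winner gets +s, then every counter is
    decayed by the factor (1 - alpha). The sum of all counters then
    converges to s * (1 - alpha) / alpha, independent of the winners.
    """
    rng = random.Random(seed)
    tau = [0.0] * n_cells
    for _ in range(steps):
        w = rng.randrange(n_cells)            # winner chosen arbitrarily here
        tau[w] += s                           # reward the winning cell
        tau = [(1 - alpha) * t for t in tau]  # decay all counters
    return tau

tau = simulate_gcs_counters()
total = sum(tau)
# With s = 1.0 and alpha = 0.05, total ≈ s*(1 - alpha)/alpha = 19.0
```

Because the fixed point s(1 − α)/α does not depend on the winner sequence, the simulated total is the same whether one cell wins every round or wins are spread uniformly, which is the essence of the conservation property described above.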
