Neocortical pyramidal neurons have many dendrites, and such dendrites are capable, in isolation from one another, of generating a neuronal spike. It is also now understood that there is a large amount of dendritic growth during the first years of a human's life, arguably a period of prodigious learning. These observations inspire the construction of a local, stochastic algorithm based on an earlier stochastic, homeostatic, Hebbian developmental theory. Here we investigate the neurocomputational advantages and limits of this novel algorithm, which combines dendritogenesis with supervised adaptive synaptogenesis. Neurons created with this algorithm have enhanced memory capacity, can avoid catastrophic interference (forgetting), and can unmix mixture distributions. In particular, individual dendrites develop within each class, in an unsupervised manner, to become feature clusters that correspond to the mixing components of the class-conditional mixture distribution. Error-free classification is demonstrated with input perturbations of up to 40%. Although discriminative problems are used to understand the capabilities of the stochastic algorithm and the neuronal connectivity it produces, the algorithm is in the generative class; it thus seems ideal for decisions that require generalization, i.e., extrapolation beyond previous learning.
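The abstract describes the algorithm only at a high level. The following is a minimal, hypothetical Python sketch of how supervised adaptive synaptogenesis over multiple independently spiking dendrites might be organized; the class name `DendriticNeuron`, the threshold `theta`, the growth probability `p_grow`, the best-matching-dendrite competition rule, and the toy mixture data are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class DendriticNeuron:
    """One neuron per class; each dendrite can generate a spike on its own.

    Hypothetical simplification: dendrite assignment is competitive
    (unsupervised within the class), while synaptogenesis is gated by
    the class label (supervised).
    """

    def __init__(self, n_inputs, n_dendrites, theta=3.0, p_grow=0.2):
        # Tiny random initial weights break the symmetry between dendrites.
        self.w = rng.random((n_dendrites, n_inputs)) * 0.01
        self.theta = theta      # dendritic spike threshold (assumed value)
        self.p_grow = p_grow    # per-synapse stochastic growth probability (assumed value)

    def dendritic_drive(self, x):
        return self.w @ x       # summed synaptic input to each dendrite

    def fires(self, x):
        # The neuron spikes if ANY dendrite crosses threshold in isolation.
        return bool(np.any(self.dendritic_drive(x) >= self.theta))

    def learn(self, x, is_own_class):
        if not is_own_class:
            return              # supervised gate: grow only on own-class inputs
        # Unsupervised competition within the class: the best-matching
        # dendrite claims the pattern, so each dendrite drifts toward one
        # mixing component of the class-conditional mixture.
        d = int(np.argmax(self.dendritic_drive(x)))
        grow = (rng.random(x.size) < self.p_grow) & (x > 0)
        self.w[d, grow] += 1.0  # stochastic Hebbian synaptogenesis on active lines

# Toy demo (hypothetical data): two classes, each a mixture of two binary
# feature clusters, with 10% of input bits flipped per presentation.
protos = {
    0: [np.r_[np.ones(5), np.zeros(15)], np.r_[np.zeros(5), np.ones(5), np.zeros(10)]],
    1: [np.r_[np.zeros(10), np.ones(5), np.zeros(5)], np.r_[np.zeros(15), np.ones(5)]],
}
neurons = {c: DendriticNeuron(n_inputs=20, n_dendrites=2) for c in protos}

for _ in range(400):
    c = int(rng.integers(2))
    x = protos[c][rng.integers(2)].copy()
    flip = rng.random(20) < 0.1
    x[flip] = 1.0 - x[flip]
    for cls, nrn in neurons.items():
        nrn.learn(x, is_own_class=(cls == c))

# Classify a clean class-0 prototype by the strongest dendritic response.
test = protos[0][0]
scores = {c: float(nrn.dendritic_drive(test).max()) for c, nrn in neurons.items()}
print("predicted class:", max(scores, key=scores.get))  # expect 0
```

Under these assumptions, each of a neuron's dendrites tends to capture one mixing component of its class, which is one plausible reading of how the dendrites "unmix" the class-conditional mixture distribution.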