Abstract

We describe the construction and performance of 'brain-based devices' (BBDs), physical devices whose behaviour is controlled by simulated nervous systems modelled on vertebrate neuroanatomy and neurophysiology, that carry out perceptual categorization and selective conditioning to visual and textural stimuli. BBDs take input from the environment through on-board sensors including cameras, microphones and artificial whiskers, and take action based on experiential learning. BBDs have a large-scale neural simulation, a phenotype, a body plan, and the means to learn through autonomous exploration. Key neural mechanisms in the present BBDs include synaptic plasticity, reward or value systems, reentrant connectivity, the dynamic synchronization of neuronal activity, and neuronal units with spatiotemporal response properties. With our BBDs, as with animals, it is the interaction of these neural mechanisms with the sensorimotor correlations generated by active sensing and self-motion that is responsible for adaptive behaviour. BBDs permit analysis of activity at all levels of the nervous system during behaviour, and as such they provide a rich source of heuristics for generating hypotheses regarding brain function. Moreover, by taking inspiration from systems neuroscience, BBDs provide a novel architecture for the design of neuromorphic systems.
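
To illustrate the kind of mechanism the abstract refers to, the sketch below shows a minimal, value-modulated Hebbian plasticity rule in Python: synaptic weights change only when pre- and postsynaptic activity coincides with a scalar value (reward) signal, which is one way the coupling between synaptic plasticity and a value system can be realised. This is an illustrative toy, not the published BBD model; all class names, parameters and the rate-based dynamics are assumptions made for this example.

```python
import numpy as np

class ValueModulatedSynapses:
    """Toy value-modulated Hebbian plasticity, loosely inspired by the
    coupling of synaptic plasticity and value systems described for BBDs.
    Names and parameters are illustrative, not from the published models."""

    def __init__(self, n_pre, n_post, learning_rate=0.01, seed=0):
        self.rng = np.random.default_rng(seed)
        self.w = self.rng.uniform(0.0, 0.1, size=(n_post, n_pre))  # weak initial weights
        self.eta = learning_rate

    def forward(self, pre_activity):
        # Simple rate-model response with a squashing nonlinearity.
        return np.tanh(self.w @ pre_activity)

    def update(self, pre_activity, post_activity, value_signal):
        # Hebbian co-activity term gated by the value signal: pathways
        # strengthen when correlated firing coincides with positive value
        # and weaken when the value signal is negative.
        dw = self.eta * value_signal * np.outer(post_activity, pre_activity)
        self.w = np.clip(self.w + dw, 0.0, 1.0)


# Usage: one "conditioning" step toward a rewarded stimulus.
syn = ValueModulatedSynapses(n_pre=4, n_post=2)
stimulus = np.array([1.0, 0.0, 0.5, 0.0])
response = syn.forward(stimulus)
syn.update(stimulus, response, value_signal=+1.0)  # reward strengthens the active pathway
```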
