Abstract

We evolve both the topology and the synaptic weights of very small recurrent spiking neural networks in the presence of noise on the membrane potential. The noise level is similar to that observed in biological neurons. The task of the networks is to recognise three signals in a particular order (the pattern ABC) in a continuous input stream in which each signal occurs with the same probability. The networks consist of adaptive exponential integrate-and-fire neurons and are limited to either three or four interneurons and one output neuron, with recurrent and self-connections allowed only for interneurons. Our results show that spiking neural networks evolved in the presence of noise are robust to changes in neuronal parameters. We propose a procedure to approximate, for each neuronal parameter, the range from which that parameter can be sampled while preserving, at least for some networks, a high true positive rate and a low false discovery rate. After assigning the states of the neurons to network states that correspond to states of a finite state transducer, we show that this simple but non-trivial computational task of temporal pattern recognition can be accomplished in a variety of ways.
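For context, a minimal simulation of a single adaptive exponential integrate-and-fire (AdEx) neuron with additive noise on the membrane potential might look as follows. This is an illustrative sketch, not the authors' code: the parameter values are the standard Brette-Gerstner defaults and the noise amplitude `sigma` is a placeholder, not the values used in the paper.

```python
# Illustrative sketch (not the authors' code): Euler integration of one
# adaptive exponential integrate-and-fire (AdEx) neuron with Gaussian noise
# added to the membrane potential. Parameter values are the standard
# Brette-Gerstner defaults, not necessarily those used in the paper.
import numpy as np

def simulate_adex(I, dt=1e-4, sigma=1e-3,
                  C=281e-12, g_L=30e-9, E_L=-70.6e-3, V_T=-50.4e-3,
                  Delta_T=2e-3, tau_w=144e-3, a=4e-9, b=80.5e-12,
                  V_reset=-70.6e-3, V_peak=20e-3):
    """Return spike times (s) for an input current trace I (A), time step dt (s).

    sigma is the assumed noise amplitude on the membrane potential (V/sqrt(s)).
    """
    rng = np.random.default_rng(0)
    V, w, spikes = E_L, 0.0, []
    for step, I_t in enumerate(I):
        # AdEx membrane and adaptation dynamics
        dV = (-g_L * (V - E_L)
              + g_L * Delta_T * np.exp((V - V_T) / Delta_T)
              - w + I_t) / C
        dw = (a * (V - E_L) - w) / tau_w
        V += dV * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        w += dw * dt
        if V >= V_peak:          # spike: reset membrane, increment adaptation
            spikes.append(step * dt)
            V, w = V_reset, w + b
    return spikes

# Example: 200 ms of a constant 1 nA input current
print(simulate_adex(np.full(2000, 1e-9)))
```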
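The target behaviour described in the abstract, emitting a response exactly when the pattern ABC has just occurred in a continuous stream, can itself be written down as a small finite state transducer. The sketch below is our own illustration of that transducer (the state names and the reset-after-match convention are assumptions), not a description of how the evolved networks implement it.

```python
# Illustrative sketch: a finite state transducer that outputs 1 exactly when
# the subsequence A, B, C has just been completed in a stream of A/B/C
# symbols, and 0 otherwise. State names and the reset-after-match convention
# are illustrative choices.
def abc_transducer(stream):
    state = ""                       # longest suffix seen that is a prefix of "ABC"
    for symbol in stream:
        if symbol == "A":
            state = "A"              # A always (re)starts a potential match
        elif symbol == "B" and state == "A":
            state = "AB"
        elif symbol == "C" and state == "AB":
            state = ""               # full match; C cannot start a new prefix
            yield 1
            continue
        else:
            state = ""               # mismatch: no useful prefix remains
        yield 0

# Example: two occurrences of ABC in the stream below
print(list(abc_transducer("ABCBABABCC")))   # -> [0, 0, 1, 0, 0, 0, 0, 0, 1, 0]
```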
