Abstract

We propose a measure of information capacity for analog neural networks that generate spatio-temporal limit-cycle oscillations in response to oscillatory or constant inputs. The measure is defined as the volume occupied by the output realizations in a multidimensional frequency-spectrum space, normalized by the volume occupied by the input realizations in the same space. Using oscillatory inputs, we estimated this capacity for a specific network of two neurons mutually coupled by inhibitory connections. The induced oscillation was characterized by limit cycles synthesized from at least two waveforms of the oscillating neurons. The estimated information capacity corresponded to nearly the same number of realizations as that of the inputs. Such networks are expected to provide a very large information capacity, because properties of the limit-cycle output (e.g., waveform and frequency) change in an analog manner with changes in the input (e.g., frequency and amplitude).
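The abstract does not give the network equations, so the following is only a minimal sketch of the kind of system described: two firing-rate neurons with mutual inhibition and a slow adaptation variable, driven by a sinusoidal input, whose output spectrum could then be used to estimate the occupied volume in frequency-spectrum space. All parameter values, the rate model, and the adaptation mechanism are assumptions chosen purely for illustration, not the authors' model.

```python
# Sketch: two mutually inhibitory rate neurons under oscillatory drive.
# Model form and parameters are illustrative assumptions.
import numpy as np

def simulate(freq_in=5.0, amp_in=1.2, T=10.0, dt=1e-3):
    n = int(T / dt)
    t = np.arange(n) * dt
    x = np.zeros((n, 2))          # firing rates of the two neurons
    a = np.zeros(2)               # slow adaptation variables
    w_inh = 2.5                   # strength of mutual inhibition (assumed)
    tau, tau_a, g_a = 0.02, 0.3, 1.0
    I = amp_in * (1.0 + np.sin(2 * np.pi * freq_in * t))  # oscillatory input
    for k in range(1, n):
        r = x[k - 1]
        # each neuron is excited by the input and inhibited by the other
        drive = I[k] - w_inh * r[::-1] - g_a * a
        x[k] = r + dt / tau * (-r + np.maximum(drive, 0.0))
        a += dt / tau_a * (-a + r)
    return t, x

def spectrum(t, x):
    # amplitude spectrum of each neuron's output (first half discarded
    # as transient), giving one point in frequency-spectrum space
    dt = t[1] - t[0]
    y = x[len(x) // 2:]
    f = np.fft.rfftfreq(len(y), dt)
    return f, np.abs(np.fft.rfft(y, axis=0)) / len(y)

if __name__ == "__main__":
    t, x = simulate()
    f, S = spectrum(t, x)
    print("dominant output frequency [Hz]:",
          f[1:][np.argmax(S[1:, 0])])  # skip the DC bin
```

In the spirit of the proposed measure, one could sweep the input frequency and amplitude over a grid, collect the resulting output spectra as points in this space, and compare the volume they occupy with the volume occupied by the corresponding input spectra; how that volume is measured (e.g., bounding region or bin counting) is not specified in the abstract.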
