Abstract
Our neurons seem capable of handling any type of data, regardless of its scale or statistical properties. In this letter, we suggest that optimal coding may occur at the single-neuron level without requiring memory, adaptation, or an evolutionary-driven fit to the stimuli. We call a neural circuit optimal if it maximizes the mutual information between its inputs and outputs. We show that the commonly encountered differentiator neurons, that is, neurons that respond mainly to changes in their input, can use their full information capacity when handling samples from any statistical distribution. We demonstrate this optimality using both analytical methods and simulations. Beyond illustrating the simplicity and elegance of neural processing, this result might suggest ways to improve the handling of data by artificial neural networks.
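The distribution-independence claim can be illustrated with a toy simulation. The sketch below is not the authors' model; it uses a hypothetical one-bit "differentiator" that fires whenever its input increases. For i.i.d. draws from any continuous distribution, P(increase) = 1/2 by exchangeability, so the output entropy approaches the full one-bit capacity regardless of the input's scale or shape:

```python
import numpy as np

def entropy_bits(symbols):
    """Plug-in Shannon entropy (in bits) of a discrete sample."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
n = 200_000
# Inputs with very different scales and statistics (arbitrary choices).
distributions = {
    "gaussian": rng.normal(size=n),
    "lognormal": rng.lognormal(size=n),
    "cauchy": rng.standard_cauchy(n),
    "uniform": rng.uniform(size=n),
}

for name, x in distributions.items():
    # Toy 1-bit differentiator: fire iff the input rose since the last sample.
    spikes = (np.diff(x) > 0).astype(int)
    # Output entropy is close to the 1-bit capacity for every distribution.
    print(f"{name:>9}: H(output) = {entropy_bits(spikes):.4f} bits")
```

Each printed entropy comes out within a fraction of a percent of 1 bit, which is the sense in which such a change-detecting code uses its full capacity without being tuned to the input statistics.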