Abstract

The performance of neuronal population coding is investigated numerically for neurons with Gaussian tuning functions of various widths and noise ratios. The model applies to both direction coding and orientation coding. It is shown that the coding error exhibits a peculiar dependence on the width of the tuning function, and that this dependence differs between the noisy and the noise-free case. In the absence of noise, the coding error increases monotonically with the width of the tuning function. When the width is below a critical value, the increase obeys a power law (with an estimated exponent of 0.501); in this regime a scaling law holds, in which the root-mean-square error is proportional to the square root of the ratio of the tuning width to the population size. When the width exceeds the critical value, the coding error grows faster than the power law. The reason for this `anomalous increase', not reported previously, is discussed. The presence of noise changes the dependence of the coding error on the tuning width: unlike the noise-free case, the error attains a minimum at an intermediate width, termed here the optimum width. The numerical results suggest that the optimum width is roughly proportional to the square root of the noise ratio but depends only weakly on the population size. It is further shown that the coding error at the optimum width rises sharply once the noise ratio exceeds about 0.5 and is inversely proportional to the square root of the population size.
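The setup described above can be sketched numerically. The following is a minimal, illustrative simulation assuming a population of neurons with circular Gaussian tuning curves, additive noise scaled by a noise ratio, and a simple population-vector decoder; the paper's exact model and decoder may differ, so this sketch should not be expected to reproduce the reported scaling exponents.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rmse(n_neurons=32, width=0.3, noise_ratio=0.1, n_trials=200):
    """Monte Carlo estimate of the root-mean-square decoding error for a
    population of direction-coding neurons with Gaussian tuning.

    Illustrative assumptions (not taken from the paper): preferred
    directions uniformly tile the circle, noise is additive Gaussian with
    standard deviation `noise_ratio` relative to the peak response, and
    decoding uses the population vector.
    """
    prefs = np.linspace(0.0, 2 * np.pi, n_neurons, endpoint=False)
    errors = []
    for _ in range(n_trials):
        theta = rng.uniform(0.0, 2 * np.pi)
        # Wrapped angular distance between stimulus and preferred directions.
        d = np.angle(np.exp(1j * (theta - prefs)))
        # Gaussian tuning curve of the given width (radians).
        rates = np.exp(-d**2 / (2 * width**2))
        # Additive noise scaled by the noise ratio.
        rates = rates + noise_ratio * rng.standard_normal(n_neurons)
        # Population-vector estimate of the stimulus direction.
        est = np.angle(np.sum(rates * np.exp(1j * prefs)))
        # Wrapped estimation error.
        errors.append(np.angle(np.exp(1j * (est - theta))))
    return np.sqrt(np.mean(np.square(errors)))
```

Sweeping `width` at a fixed `noise_ratio` with such a simulation is one way to probe for an error minimum at an intermediate width, as the abstract describes for the noisy case.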
