Neural codes have been postulated to build efficient representations of the external world. The hippocampus, an encoding system, employs neuronal firing rates and spike phases to encode external space. Although the biophysical origin of such codes lies at the level of single neurons, the role of neural components in efficient coding is not understood. The complexity of this problem lies in the dimensionality of the parametric space encompassing neural components and is amplified by the enormous biological heterogeneity observed in each parameter. A central question spanning encoding systems is therefore how neurons arrive at efficient codes in the face of widespread biological heterogeneities. To answer this, we developed a conductance-based spiking model for phase precession, a phase code of external space exhibited by hippocampal place cells. Our model accounted for several experimental observations on place-cell firing and electrophysiology: the emergence of phase precession from exact spike timings of conductance-based models with neuron-specific ion channels and receptors; biological heterogeneities in neural components and excitability; the emergence of a subthreshold voltage ramp, increased firing rate, and enhanced theta power within the place field; a signature reduction in extracellular theta frequency compared to its intracellular counterpart; and experience-dependent asymmetry in the firing-rate profile. We formulated phase-coding efficiency, using Shannon's information theory, as an information-maximization problem with spike phase as the response and external space within a single place field as the stimulus. We employed an unbiased stochastic search spanning an 11-dimensional neural space, involving thousands of iterations that accounted for biophysical richness and neuron-to-neuron heterogeneities.
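The information-maximization formulation described above can be illustrated with a minimal plug-in (histogram) estimator of Shannon mutual information between the stimulus (position within a single place field) and the response (spike phase). This is a sketch under stated assumptions: the function name, bin counts, and discretization are illustrative choices, not values or methods taken from the study.

```python
import numpy as np

def mutual_information(position, phase, n_pos_bins=10, n_phase_bins=18):
    """Plug-in (histogram) estimate of I(position; phase) in bits.

    position : spike locations within the place field (arbitrary units)
    phase    : theta phase of each spike (radians, in [0, 2*pi))
    Bin counts are illustrative assumptions, not values from the study.
    """
    # Joint histogram over discretized stimulus (position) and response (phase)
    joint, _, _ = np.histogram2d(position, phase,
                                 bins=[n_pos_bins, n_phase_bins])
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal over position
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal over phase
    nz = p_xy > 0                           # skip empty cells to avoid log(0)
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))
```

As a sanity check, spikes whose phase advances deterministically with position (as in idealized phase precession) yield high mutual information, whereas phases drawn independently of position yield an estimate near zero.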
We found a small subset of models that exhibited efficient spatial information transfer through the phase code, and investigated the distinguishing features of this subpopulation at the parametric and functional scales. At the parametric scale, which spans the molecular components that define the neuron, several nonunique parametric combinations with weak pairwise correlations yielded models with similarly high phase-coding efficiency. Importantly, placing additional constraints on these models, in terms of matching other aspects of hippocampal neural responses, did not hamper parametric degeneracy. We provide quantitative evidence demonstrating this parametric degeneracy to be a consequence of a many-to-one relationship between the different parameters and phase-coding efficiency. At the functional scale, involving cellular-scale neural properties, our analyses revealed an important higher-order constraint that was exclusive to models exhibiting efficient phase coding. Specifically, we found a counterbalancing negative correlation between neuronal gain and the strength of external synaptic inputs to be a critical functional constraint for the emergence of efficient phase coding. These observations implicate intrinsic neural properties as important contributors to effectuating such counterbalance, which can be achieved by recruiting nonunique parametric combinations. Finally, we show that a change in afferent statistics, manifesting as input asymmetry onto these neuronal models, induced an adaptive shift in the phase code that preserved its efficiency. Together, our analyses unveil parametric degeneracy as a mechanism to harness widespread neuron-to-neuron heterogeneity toward accomplishing stable and efficient encoding, provided specific higher-order functional constraints on the relationship of neuronal gain to external inputs are satisfied.
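The search-and-select logic behind these findings can be sketched with a toy model: sample an 11-dimensional parameter space uniformly, score each sample with a stand-in "efficiency" that peaks when neuronal gain and synaptic drive counterbalance each other, and retain the top scorers. Everything here (the parameter ranges, the mapping from parameters to gain and drive, the target product of 1.5, and the 5% selection threshold) is a hypothetical stand-in for the study's conductance-based simulations, meant only to illustrate how a counterbalancing negative correlation emerges in the selected subpopulation despite unconstrained individual parameters.

```python
import numpy as np

rng = rng = np.random.default_rng(7)
N_MODELS, N_PARAMS = 20000, 11
# Unbiased uniform sampling of a hypothetical 11-parameter space
params = rng.uniform(0.5, 2.0, size=(N_MODELS, N_PARAMS))

# Stand-in many-to-one mapping: "gain" and "drive" each summarize several
# underlying parameters (placeholder for the real biophysical simulations)
gain = params[:, :5].mean(axis=1)
drive = params[:, 5:].mean(axis=1)

# Stand-in efficiency: highest when gain and drive counterbalance
# (their product sits at an assumed target of 1.5)
efficiency = -np.abs(gain * drive - 1.5)

keep = efficiency >= np.quantile(efficiency, 0.95)   # retain top 5% of models
r = np.corrcoef(gain[keep], drive[keep])[0, 1]
print(f"kept {keep.sum()} models; corr(gain, drive) = {r:.2f}")
```

In the retained subpopulation, gain and drive are strongly negatively correlated (the counterbalance), while many nonunique combinations of the underlying parameters survive the selection, mirroring the degeneracy described above.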