Abstract
Our ability to manipulate objects relies on tactile inputs from first-order tactile neurons that innervate the glabrous skin of the hand. The distal axon of these neurons branches in the skin and innervates many mechanoreceptors, yielding spatially complex receptive fields. Here we show that synaptic integration across the complex signals from the first-order neuronal population could underlie the human ability to accurately (< 3°) and rapidly process the orientation of edges moving across the fingertip. We first derive spiking models of human first-order tactile neurons that fit and predict responses to moving edges with high accuracy. We then use the model neurons to simulate the peripheral neuronal population that innervates a fingertip. We train classifiers that perform synaptic integration across the population activity, and show that such integration can process edge orientations with high acuity and speed. In particular, our models suggest that integration of fast-decaying (AMPA-like) synaptic inputs within short timescales is critical for discriminating fine orientations, whereas integration of slow-decaying (NMDA-like) synaptic inputs supports discrimination of coarser orientations and maintains robustness over longer timescales. Taken together, our results provide new insight into the computations occurring in the earliest stages of the human tactile processing pathway and how they may be critical for supporting hand function.
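The contrast drawn above between fast-decaying (AMPA-like) and slow-decaying (NMDA-like) synaptic inputs can be illustrated with a minimal sketch. This is not the authors' model: the time constants (2 ms and 100 ms) and the unit-amplitude leaky-integration scheme are assumptions chosen only to show why a fast trace preserves fine temporal structure in a spike train while a slow trace summates over longer windows.

```python
import math

# Assumed time constants for illustration (not from the paper):
TAU_FAST_MS = 2.0    # fast, AMPA-like decay
TAU_SLOW_MS = 100.0  # slow, NMDA-like decay
DT_MS = 1.0          # simulation step

def synaptic_trace(spike_times_ms, tau_ms, t_end_ms, dt_ms=DT_MS):
    """Leaky integration of unit-amplitude spikes with decay constant tau.

    Each step the trace decays by exp(-dt/tau); each spike adds 1.0.
    Returns the trace sampled at every time step.
    """
    n_steps = int(t_end_ms / dt_ms)
    decay = math.exp(-dt_ms / tau_ms)
    spike_steps = {int(t / dt_ms) for t in spike_times_ms}
    trace, out = 0.0, []
    for step in range(n_steps):
        trace *= decay
        if step in spike_steps:
            trace += 1.0
        out.append(trace)
    return out

# Two spikes 10 ms apart: the fast trace decays almost fully between
# them, keeping the two events temporally distinct, while the slow
# trace summates, integrating them into one sustained signal.
spikes = [10.0, 20.0]
fast = synaptic_trace(spikes, TAU_FAST_MS, t_end_ms=50.0)
slow = synaptic_trace(spikes, TAU_SLOW_MS, t_end_ms=50.0)
```

Midway between the spikes (t = 15 ms) the fast trace has decayed to under 10% of its peak while the slow trace retains over 90%, which is the sense in which short-timescale integration can resolve fine temporal differences and long-timescale integration trades that resolution for robustness.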