Abstract

Shoulder-controlled hand neuroprostheses are wearable devices designed to assist hand function in people with cervical spinal cord injury (SCI). They use preserved shoulder movements to control artificial actuators. Due to the concurrent afferent (i.e., shoulder proprioception) and visual (i.e., hand response) feedback, these wearables may affect the user's somatosensory body representation. To investigate this effect, we propose an experimental paradigm that uses an immersive virtual reality (VR) environment to emulate the use of a shoulder-controlled hand neuroprosthesis, together with an adapted version of a visual-tactile integration task (i.e., the Crossmodal Congruency Task) as an assessment tool. Data from seven non-disabled participants validate the experimental setup, with preliminary statistical analysis revealing no significant difference between the means of the VR and visual-tactile integration tasks. These results serve as a proof of concept for the proposed paradigm, paving the way for further research with an improved experimental design and a larger sample size.
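
The following is a minimal sketch of how a participant-level comparison like the one described above might be run: it computes a crossmodal congruency effect (CCE, the difference between incongruent and congruent reaction times) per participant in two conditions and compares the condition means with a paired t-test. The data layout, the simulated reaction times, and the choice of a paired t-test are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical analysis sketch (not the authors' pipeline):
# per-participant Crossmodal Congruency Effect (CCE) in two conditions,
# compared across participants with a paired t-test.

import numpy as np
from scipy import stats

def crossmodal_congruency_effect(congruent_rt, incongruent_rt):
    """CCE = mean incongruent RT minus mean congruent RT (ms)."""
    return np.mean(incongruent_rt) - np.mean(congruent_rt)

rng = np.random.default_rng(0)
n_participants, n_trials = 7, 40  # matches the seven participants reported

cce_baseline, cce_vr = [], []
for _ in range(n_participants):
    # Simulated reaction times (ms); congruent trials are faster on average.
    base_con = rng.normal(450, 40, n_trials)
    base_inc = rng.normal(480, 40, n_trials)
    vr_con = rng.normal(450, 40, n_trials)
    vr_inc = rng.normal(480, 40, n_trials)
    cce_baseline.append(crossmodal_congruency_effect(base_con, base_inc))
    cce_vr.append(crossmodal_congruency_effect(vr_con, vr_inc))

# Paired t-test across participants: does the mean CCE differ between conditions?
t_stat, p_value = stats.ttest_rel(cce_baseline, cce_vr)
print(f"mean CCE baseline = {np.mean(cce_baseline):.1f} ms, VR = {np.mean(cce_vr):.1f} ms")
print(f"paired t({n_participants - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```

With only seven participants, a non-significant result as in this sketch cannot distinguish a true null effect from low statistical power, which is consistent with the abstract's call for a larger sample.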
