Abstract

Quantifying finger kinematics can improve the authors' understanding of finger function, facilitate the design of efficient prosthetic devices, help identify movement disorders, and support the assessment of rehabilitation interventions. Here, the authors present a study that quantifies the grasps depicted in grasp taxonomies during selected Activities of Daily Living (ADL). A single participant held a series of standard objects using specific grasps, and the resulting data were used to train a Convolutional Neural Network (CNN) for each of the four fingers individually. The experiment also recorded hand manipulation of objects during ADL. Each set of ADL finger kinematic data was then classified by the trained CNNs, which identified and quantified the grasps required to accomplish each task. Certain grasps appeared more often depending on the finger studied, indicating that, despite their physiological interdependencies, fingers retain a degree of autonomy in performing dexterous tasks. The identified and most frequent grasps agreed with previously reported findings, but also highlighted that an individual may have specific dexterity needs that vary with profession and age. The proposed method can be used to identify and quantify key grasps for finger/hand prostheses, providing a more efficient solution that is practical for users' day-to-day tasks.
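To make the per-finger classification idea concrete, the following is a minimal sketch of what such a classifier could look like, assuming windows of joint-angle samples as input. The channel count, window length, number of grasp classes, and all layer sizes are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

# Illustrative assumptions (not values from the paper):
N_CHANNELS = 3   # joint angles recorded per finger, e.g. MCP/PIP/DIP
WINDOW = 100     # kinematic samples per input window
N_GRASPS = 8     # grasp classes drawn from a grasp taxonomy

class FingerGraspCNN(nn.Module):
    """Hypothetical 1D CNN mapping one finger's joint-angle window to a grasp class."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one feature vector
        )
        self.classifier = nn.Linear(32, N_GRASPS)

    def forward(self, x):  # x shape: (batch, N_CHANNELS, WINDOW)
        return self.classifier(self.features(x).squeeze(-1))

# One such model would be trained per finger; at test time each ADL
# window is assigned the grasp class with the highest score.
model = FingerGraspCNN()
scores = model(torch.randn(1, N_CHANNELS, WINDOW))
predicted_grasp = scores.argmax(dim=1)
```

Counting these per-window predictions over a full ADL recording would yield the kind of grasp-frequency quantification the abstract describes.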
