Abstract

Unlike sign language, which usually involves large-scale movements to form a gesture, finger language, which is suitable for handicapped aphasic patients, consists of relatively small-scale hand gestures produced by merely changing how a patient bends his or her fingers. A recognition system must therefore accommodate the specific characteristics of each individual patient. We propose a system that meets this requirement by employing a programmable data glove to capture these small finger gestures, a function parameterized by optical signal values to compute the finger bending degrees, and an automatic regression module to extract the finger features most adequate for a specific patient. The selected features are fed into a neural network, which learns a finger language recognition model for that patient; the system is then ready for use by that patient. At the time of this writing, unbiased field experiments have achieved an average success rate of 100%.
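
The sketch below illustrates the per-patient pipeline the abstract describes: raw glove signals mapped to bending degrees, regression-based feature selection, and a neural network classifier. All names, the linear calibration, and the use of scikit-learn's RFE and MLPClassifier are illustrative assumptions; the paper's actual parameterized function and regression module are not specified here.

```python
# Minimal sketch of the per-patient recognition pipeline (assumed components).
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# --- 1. Simulated data-glove readings: one optical signal value per sensor ---
N_SAMPLES, N_SENSORS, N_GESTURES = 200, 10, 5
raw_signals = rng.uniform(0.0, 1.0, size=(N_SAMPLES, N_SENSORS))
labels = rng.integers(0, N_GESTURES, size=N_SAMPLES)

# --- 2. Optical-signal-value-parameterized bending function (assumed linear) ---
# gain/offset would come from a per-patient calibration session.
gain = rng.uniform(60.0, 90.0, size=N_SENSORS)    # degrees per unit signal
offset = rng.uniform(-5.0, 5.0, size=N_SENSORS)   # degrees at zero signal
bending_degrees = raw_signals * gain + offset

# --- 3. "Automatic regression module": keep the most informative sensors ---
# Recursive feature elimination with a logistic-regression estimator stands
# in for the paper's patient-specific feature extraction.
selector = RFE(LogisticRegression(max_iter=1000), n_features_to_select=6)
selected = selector.fit_transform(bending_degrees, labels)

# --- 4. Neural network trained on the selected features for this patient ---
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(selected, labels)
print("training accuracy:", model.score(selected, labels))
```

With synthetic random data the reported accuracy is meaningless; the point is the flow from calibrated bending degrees, through per-patient feature selection, to a patient-specific classifier.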
