Abstract

We previously developed a gesture interface for people with motor dysfunction using an RGB-D camera. We collected 226 gesture samples from 58 individuals with motor dysfunction, classified the data, and developed multiple recognition modules based on it. The interface comprises nine modules for recognizing various types of gestures. For this study, a person with a disability used the interface for a transcription task in combination with a trackball, an input device he had already been using. We configured two gesture-input switches based on the movement of two sites on his body that were easy for him to move. The user entered characters on the on-screen keyboard with the trackball and separately operated a sound player through our gesture interface. He continued using this combination daily for half a year and was able to reduce his input time by half. We currently supply AAGI free of charge to Japanese people with motor dysfunction, and we will make AAGI available to users abroad through our home page next year.
