This paper presents hand-gesture-based control of an omnidirectional wheelchair using an inertial measurement unit (IMU) and myoelectric units as wearable sensors. Seven common gestures are recognized and classified using shape-based feature extraction and a Dendrogram Support Vector Machine (DSVM) classifier. The dynamic gestures are mapped to omnidirectional motion commands to navigate the wheelchair. A single IMU measures the wrist tilt angle and acceleration along three axes. EMG signals are extracted from two forearm muscles, the Extensor Carpi Radialis and the Flexor Carpi Radialis, and processed to yield Root Mean Square (RMS) signals. Initiation and termination of dynamic activities are detected autonomously by identifying static-to-dynamic and dynamic-to-static transitions using fixed thresholds on the processed IMU and myoelectric sensor data. Classification recognizes the activity pattern from the periodic shape of the trajectories of the triaxial wrist tilt angle and the EMG RMS of the two selected muscles. Second-order polynomial coefficients extracted from the sensor trajectory templates during specific dynamic activity cycles serve as features for classifying dynamic activities. The classification algorithm and real-time navigation of the wheelchair using the proposed algorithm were tested by five healthy subjects. The DSVM classifier achieved 94% classification accuracy on k-fold cross-validation data from the five users; classification accuracy during wheelchair operation was 90.5%.
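The two signal-processing steps named in the abstract, computing the EMG RMS and extracting second-order polynomial coefficients from a sensor trajectory over one activity cycle, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the window length, time normalization, and the synthetic trajectory are assumptions.

```python
import numpy as np

def emg_rms(window):
    """Root Mean Square of one EMG window (single muscle channel)."""
    window = np.asarray(window, dtype=float)
    return float(np.sqrt(np.mean(window ** 2)))

def shape_features(trajectory):
    """Second-order polynomial coefficients fitted to a sensor
    trajectory over one dynamic activity cycle (shape features).
    Returns [a2, a1, a0] for a2*t**2 + a1*t + a0."""
    trajectory = np.asarray(trajectory, dtype=float)
    t = np.linspace(0.0, 1.0, len(trajectory))  # cycle-normalized time
    return np.polyfit(t, trajectory, deg=2)

# Hypothetical example: a roughly parabolic wrist-tilt trajectory
# that peaks mid-cycle, as a dynamic gesture might.
t = np.linspace(0.0, 1.0, 50)
tilt = 30.0 * t * (1.0 - t)
coeffs = shape_features(tilt)               # ~[-30, 30, 0]
feature_vector = np.append(coeffs, emg_rms(tilt))
```

In this scheme, one such polynomial-coefficient vector per sensor channel (three tilt axes plus two EMG RMS channels) would be concatenated into the feature vector passed to the DSVM classifier.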