Abstract

Activity and gesture recognition from body-worn acceleration sensors is an important application in body area sensor networks. The key to any such recognition task is a set of discriminative, variation-tolerant features. Good features can also reduce the energy requirements of the sensor network and increase the robustness of the recognition. We propose a feature extraction method based on genetic programming. We benchmark this method on two datasets and compare the results to feature selection, the approach typically used to obtain a feature set. With one extracted feature we achieve 73.4% accuracy on a fitness activity dataset, compared to 70.1% with one selected standard feature. On a gesture-based HCI dataset we achieve 95.0% accuracy with one extracted feature, whereas a selection of up to five standard features achieves 90.6% in the same setting. On the HCI dataset we also evaluate the robustness of extracted features to sensor displacement, a common problem in movement-based activity and gesture recognition. With one extracted feature we achieve 85.0% accuracy at a displaced sensor position, versus 55.2% with the best selection of standard features. These results show that the proposed genetic programming feature extraction method is superior to feature selection over standard features.
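To make the idea concrete, below is a minimal, hypothetical sketch of genetic-programming-based feature extraction, not the paper's actual operator set or evolutionary loop: candidate features are arithmetic expression trees over per-window summary statistics of an acceleration signal, each scored by how well a single threshold on the resulting feature value separates two activity classes. The operator set, terminal set, and the simplified truncation-plus-reseeding evolution are illustrative assumptions.

```python
import random
import numpy as np

# Hypothetical GP feature extraction sketch (illustrative, not the paper's
# exact method): evolve arithmetic expressions over per-window statistics of
# a 1-D acceleration signal.

OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}
TERMINALS = ["mean", "std", "min", "max"]  # per-window summary statistics

def random_tree(depth=2):
    """Grow a random expression tree: either a terminal or (op, left, right)."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, stats):
    """Evaluate an expression tree against one window's statistics."""
    if isinstance(tree, str):
        return stats[tree]
    op, left, right = tree
    return OPS[op](evaluate(left, stats), evaluate(right, stats))

def window_stats(w):
    return {"mean": w.mean(), "std": w.std(), "min": w.min(), "max": w.max()}

def fitness(tree, windows, labels):
    """Best single-threshold classification accuracy of the 1-D feature."""
    vals = np.array([evaluate(tree, window_stats(w)) for w in windows])
    best = 0.0
    for t in vals:
        pred = vals > t
        # accept either polarity of the threshold decision
        acc = max((pred == labels).mean(), (pred != labels).mean())
        best = max(best, acc)
    return best

def evolve(windows, labels, pop_size=30, gens=10):
    """Simplified evolution: truncation selection plus random reseeding
    (stand-ins for the crossover/mutation a full GP system would use)."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: fitness(t, windows, labels), reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [random_tree() for _ in range(pop_size - len(survivors))]
    return max(pop, key=lambda t: fitness(t, windows, labels))
```

As a usage sketch, evolving on synthetic windows where one class has low signal variance and the other high variance should yield a feature (often one involving `std`) whose threshold cleanly separates the classes.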

