Abstract

Recent developments in sensor technology have increased the importance of action recognition. Previous studies were generally based on the assumption that complex actions can be recognized better with more features; as a result, researchers often used more body-worn sensor types and sensor nodes than necessary. However, this assumption brings many drawbacks, such as computational complexity and storage and communication requirements. The main aim of this paper is to investigate whether actions can be recognized with fewer sensors, without degrading accuracy, by using more sophisticated feature extraction and classification methods. Since human activities are complex and inherently contain variable temporal information, this study employs the one-dimensional local binary pattern, which is sensitive to local changes, for feature extraction, and grey relational analysis, which can successfully classify incomplete or insufficient datasets, for classification. The mean classification accuracies achieved by the proposed approach are 95.69 %, 98.88 %, and 99.08 % when utilizing all data, data from a sensor node attached to the left calf, and data from only the 3D gyroscope sensors, respectively. Furthermore, the results show that the accuracy obtained by using only a 3D acceleration sensor attached to the left calf, 98.8 %, is higher than both the accuracy obtained by using all sensor nodes, 95.69 %, and the accuracies reported in previous studies that used the same dataset. This result highlights that the position and type of the sensors are much more important than the number of sensors used.
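The abstract names two standard techniques: a one-dimensional local binary pattern (1D-LBP) that encodes each sample by comparing its neighbours to it, and grey relational analysis (GRA), which scores a test vector against per-class reference vectors. The sketch below is a minimal illustration of those two textbook formulations, not the paper's actual implementation; the function names, the neighbourhood size `p`, and the conventional distinguishing coefficient `rho = 0.5` are assumptions.

```python
import numpy as np


def lbp_1d(signal, p=8):
    """One-dimensional local binary pattern (textbook form, an assumption).

    Each sample is compared with its p surrounding samples (p/2 on each
    side); neighbours >= centre contribute a 1-bit. The p-bit codes are
    histogrammed into a normalized 2**p-bin feature vector.
    """
    half = p // 2
    signal = np.asarray(signal, dtype=float)
    codes = []
    for i in range(half, len(signal) - half):
        neighbours = np.concatenate(
            (signal[i - half:i], signal[i + 1:i + 1 + half]))
        bits = (neighbours >= signal[i]).astype(int)
        codes.append(int(bits.dot(1 << np.arange(p))))  # p-bit binary code
    hist, _ = np.histogram(codes, bins=2 ** p, range=(0, 2 ** p))
    return hist / max(len(codes), 1)


def grey_relational_grade(test, references, rho=0.5):
    """Grey relational grade of a test vector against class references.

    Uses the classical grey relational coefficient
    (d_min + rho * d_max) / (delta + rho * d_max), averaged over the
    features; the class with the highest grade is the prediction.
    """
    test = np.asarray(test, dtype=float)
    refs = np.asarray(references, dtype=float)  # shape: (n_classes, n_features)
    delta = np.abs(refs - test)
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)  # one grade per class
```

In a pipeline of this kind, the class references would typically be the mean 1D-LBP histograms of the training samples of each class, and a test signal is assigned to `np.argmax(grey_relational_grade(...))`.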
