Abstract

Studying human motion requires modelling its multiple-temporal-scale nature in order to fully describe its complexity, since different muscles are activated and coordinated by the brain at different temporal scales in a complex cognitive process. Nevertheless, current approaches are unable to address this requirement properly and rely on oversimplified models with obvious limitations. Data-driven methods represent a viable tool to address these limitations. However, shallow data-driven models, while achieving reasonably good recognition performance, require handcrafted features based on domain-specific knowledge which, in this case, is limited and does not allow motion- and subject-specific temporal scales to be modelled properly. In this work, we propose a new deep multiple-temporal-scale data-driven model, based on Temporal Convolutional Networks, able to automatically learn features from the data at different temporal scales. Our proposal focuses first on outperforming state-of-the-art shallow and deep models in terms of recognition performance. Then, thanks to the use of feature ranking for shallow models and attention maps for deep models, we give insights into what the different architectures actually learned from the data. We designed, collected data for, and tested our proposal in a custom motion-recognition experiment: identifying the person who drew a particular shape (i.e., an ellipse) on a graphics tablet, collecting data about his/her movement (e.g., pressure and speed) in different extrapolation scenarios (e.g., training with data collected from one hand and testing the model on the other). The data collected in our experiment and the code of the methods are also made freely available to the research community. Results, both in terms of accuracy and of insight into the cognitive problem, support the proposal and its use as a tool for better understanding human movement and its multiple-temporal-scale nature.
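
The abstract does not detail the architecture, so the following is only a minimal illustrative sketch (not the authors' exact model): a small multi-branch temporal convolutional network in PyTorch, where each branch uses a different dilation to capture a different temporal scale. The input signals (e.g., pen pressure and speed over time), layer sizes, dilation values, and class count are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class MultiScaleTCN(nn.Module):
    """Toy multi-temporal-scale TCN-style classifier (illustrative only)."""
    def __init__(self, in_channels=2, hidden=32, n_classes=10, dilations=(1, 2, 4, 8)):
        super().__init__()
        # One dilated 1-D convolutional branch per temporal scale.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_channels, hidden, kernel_size=3, padding=d, dilation=d),
                nn.ReLU(),
            )
            for d in dilations
        ])
        self.head = nn.Linear(hidden * len(dilations), n_classes)

    def forward(self, x):  # x: (batch, channels, time)
        # Global average pooling over time in each branch, then concatenate scales.
        feats = [branch(x).mean(dim=-1) for branch in self.branches]
        return self.head(torch.cat(feats, dim=-1))

# Example: 4 recordings, 2 signals (pressure, speed), 500 time steps each.
model = MultiScaleTCN()
logits = model(torch.randn(4, 2, 500))  # -> (4, 10) class scores
```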
