Abstract
The widespread use of wearable sensors such as smartwatches provides access to valuable inertial sensor data that can be used to identify individuals by their gait patterns. A large number of studies have been conducted on extracting high-level heuristic features from inertial sensor data to identify discriminative gait signatures and distinguish a target individual from others. However, the complexity of the data collected from inertial sensors and the decoupling of the predictive learning model from the hand-crafted feature extraction module increase the error rate of manual feature extraction. We propose a new method for human gait identification based on a spectro-temporal two-dimensional expansion of the gait cycle. We then design a deep convolutional neural network (DCNN) that extracts discriminative features from the two-dimensional expanded gait cycles and jointly optimizes the identification model. We propose a systematic approach for processing non-stationary motion signals, applied to human gait identification, with three main elements: (1) gait cycle extraction, (2) spectro-temporal representation of the gait cycle, and (3) deep convolutional learning. We collect motion signals from five inertial sensors placed at different body locations: lower back, chest, right knee, right ankle, and right wrist. We pre-process the acquired raw signals and then propose an efficient heuristic segmentation methodology to extract gait cycles from the processed data. Spectro-temporal two-dimensional features are extracted by merging key instantaneous temporal and spectral descriptors within a gait cycle, which characterizes the non-stationarities in the inertial data of each gait cycle. The two-dimensional time-frequency representations of the gait cycles extracted from the inertial sensor data of 10 subjects are fed as input to the proposed 10-layer DCNN architecture. In our experimental analysis, an accuracy of 93.36% was achieved for the subject identification task.
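To make the three-element pipeline concrete, the following Python sketch illustrates one plausible realization of it: heuristic gait cycle segmentation, spectro-temporal (time-frequency) expansion of each cycle, and a compact convolutional classifier. It is not the authors' implementation; the sampling rate, the peak-detection segmentation heuristic, the spectrogram parameters, and the layer layout are all illustrative assumptions, and the network shown is smaller than the 10-layer DCNN described above.

# Minimal sketch (not the authors' code) of the gait identification pipeline:
# segmentation -> spectro-temporal 2-D expansion -> convolutional classifier.
# Sampling rate, segmentation heuristic, STFT parameters, and layer sizes are
# illustrative assumptions, not values taken from the paper.
import numpy as np
from scipy.signal import find_peaks, spectrogram
import torch
import torch.nn as nn

FS = 100          # assumed inertial sensor sampling rate (Hz)
N_SUBJECTS = 10   # number of subjects reported in the abstract

def segment_gait_cycles(acc_signal, fs=FS, min_cycle_s=0.8):
    """Hypothetical heuristic segmentation: split the signal at dominant
    acceleration peaks (e.g. heel strikes) into consecutive gait cycles."""
    peaks, _ = find_peaks(acc_signal, distance=int(min_cycle_s * fs))
    return [acc_signal[a:b] for a, b in zip(peaks[:-1], peaks[1:])]

def gait_cycle_to_tf_image(gait_cycle, fs=FS, nperseg=32, noverlap=24):
    """Spectro-temporal 2-D expansion of a 1-D gait cycle via a spectrogram."""
    _, _, sxx = spectrogram(gait_cycle, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return np.log1p(sxx)  # log scaling compresses the dynamic range

class GaitCNN(nn.Module):
    """Compact CNN over time-frequency images (an illustrative layout, not the
    exact 10-layer architecture from the paper)."""
    def __init__(self, n_classes=N_SUBJECTS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):  # x: (batch, 1, freq_bins, time_frames)
        return self.classifier(self.features(x).flatten(1))

# Example on a synthetic accelerometer trace with a roughly 1 Hz step rhythm.
t = np.arange(0, 6.0, 1.0 / FS)
acc = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.randn(t.size)
cycles = segment_gait_cycles(acc.astype(np.float32))
tf_image = gait_cycle_to_tf_image(cycles[0])
batch = torch.from_numpy(tf_image).float().unsqueeze(0).unsqueeze(0)
logits = GaitCNN()(batch)  # per-subject identification scores for one gait cycle

In this sketch, training the CNN end to end on the time-frequency images plays the role of the joint feature extraction and identification optimization described in the abstract, replacing manually engineered gait features.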