Abstract
Cyborg intelligence is devoted to enhancing human physical abilities by integrating artificial intelligence (AI) with in-the-body technologies and biological behaviors. In this regard, deep learning (DL) based gait recognition has emerged as an unobtrusive subfield of cyborg intelligence for human identification and authentication. Real-world gait data are usually collected freely, without knowledge of the time, place, or style of a person's walk, which introduces many uncertainties that negatively impact the performance of DL models. This study presents a new fuzzy-based temporal convolutional autoencoder framework (termed FTCAE) for gait recognition from inertial gait time series. The temporal convolutional autoencoder combines the representational capability of an autoencoder with temporal convolutions to automatically extract valuable information from inertial data representing complex and dynamic gait patterns. In addition, a novel interval type-2 fuzzy set (IT2FS) dense layer is introduced to handle the uncertainty, imprecision, and noise of the feature maps, enabling the learning of informative representations in a fuzzy latent space. The IT2FS layer introduces a local feedback mechanism that strengthens the network's ability to model uncertainty in the temporal dependencies of human gait data. Proof-of-concept experiments on public gait sensory datasets validate the efficiency of the proposed FTCAE, with accuracies of 98.48% and 95.11% for authentication and identification, respectively.
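To make the described architecture concrete, the following is a minimal, hypothetical sketch of an FTCAE-style model in PyTorch: a 1D temporal convolutional autoencoder whose bottleneck passes through an interval type-2 fuzzy dense layer with Gaussian memberships of uncertain width. All layer sizes, membership functions, and the simplified type reduction (averaging lower and upper firing strengths) are illustrative assumptions, not the authors' exact design; the paper's local feedback mechanism is likewise omitted here.

# Hypothetical FTCAE-style sketch (not the authors' implementation).
import torch
import torch.nn as nn


class IT2FuzzyDense(nn.Module):
    """Dense layer with Gaussian interval type-2 memberships (uncertain width)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        # Lower/upper spreads define the footprint of uncertainty.
        self.log_sigma_lower = nn.Parameter(torch.zeros(out_features, in_features))
        self.log_sigma_upper = nn.Parameter(torch.full((out_features, in_features), 0.5))
        self.proj = nn.Linear(out_features, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features)
        diff = x.unsqueeze(1) - self.centers                    # (batch, out, in)
        mu_lower = torch.exp(-0.5 * (diff / self.log_sigma_lower.exp()) ** 2)
        mu_upper = torch.exp(-0.5 * (diff / self.log_sigma_upper.exp()) ** 2)
        # Rule firing strengths: mean over input dimensions, then a simple
        # type reduction by averaging the lower and upper strengths.
        f_lower = mu_lower.mean(dim=-1)
        f_upper = mu_upper.mean(dim=-1)
        reduced = 0.5 * (f_lower + f_upper)                     # (batch, out)
        return self.proj(reduced)


class FTCAESketch(nn.Module):
    """Temporal convolutional autoencoder with an IT2 fuzzy bottleneck."""

    def __init__(self, channels: int = 6, seq_len: int = 128, latent: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),              # (batch, 64)
        )
        self.fuzzy_latent = IT2FuzzyDense(64, latent)
        self.decoder = nn.Sequential(
            nn.Linear(latent, 64 * seq_len), nn.ReLU(),
            nn.Unflatten(1, (64, seq_len)),
            nn.Conv1d(64, channels, kernel_size=5, padding=2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, seq_len) windows of inertial gait signals.
        z = self.fuzzy_latent(self.encoder(x))
        return self.decoder(z)


if __name__ == "__main__":
    model = FTCAESketch()
    window = torch.randn(4, 6, 128)                             # 4 windows of 6-axis IMU data
    recon = model(window)
    print(recon.shape)                                          # torch.Size([4, 6, 128])

In a gait recognition pipeline of this kind, the autoencoder would typically be trained on a reconstruction loss, after which the fuzzy latent representations are fed to a classifier for identification or compared against enrolled templates for authentication.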