Simple Summary

In this work, we explore self-supervised pretraining for gait recognition. We gather the largest dataset to date of real-world gait sequences, automatically annotated through pose tracking (UWG), which offers realistic confounding factors in contrast to current datasets. Results highlight strong performance in scenarios with scarce training data, and state-of-the-art accuracy on skeleton-based gait recognition when utilizing all available training data.

Abstract

The use of gait for person identification has important advantages: it is non-invasive, unobtrusive, does not require cooperation, and is less likely to be obscured than other biometrics. Existing methods for gait recognition require cooperative gait scenarios, in which a single person walks multiple times in a straight line in front of a camera. We address the challenges of real-world scenarios, in which camera feeds capture multiple people who, in most cases, pass in front of the camera only once. We address privacy concerns by using only the motion information of walking individuals, with no identifiable appearance-based information. We propose a self-supervised learning framework, WildGait, which consists of pre-training a Spatio-Temporal Graph Convolutional Network on a large number of automatically annotated skeleton sequences obtained from raw, real-world surveillance streams, in order to learn useful gait signatures. We collected and compiled Uncooperative Wild Gait (UWG), the largest pretraining dataset to date of anonymized walking skeletons, containing over 38k tracklets of walking 2D skeletons, and we make it available to the research community. Our results surpass the current state-of-the-art pose-based gait recognition solutions. The proposed approach enables reliable training of gait recognition models in unconstrained environments, especially in settings with scarce amounts of annotated data.
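For readers who want a concrete picture of the pipeline described in the abstract, the snippet below is a minimal, illustrative PyTorch sketch of self-supervised pretraining of a spatio-temporal graph-convolutional encoder on 2D walking skeletons. It is not the authors' implementation: the COCO-style joint topology, the two-block encoder, the augmentations, and the SimCLR-style contrastive objective are assumptions made only to keep the example self-contained and runnable.

```python
# Illustrative sketch only (not the WildGait code): contrastive self-supervised
# pretraining of a small spatio-temporal graph-convolutional encoder on
# anonymized 2D skeleton tracklets. Topology, augmentations and loss are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

J = 17  # assumed COCO-style joint count

# Illustrative skeleton edges (COCO-style topology).
EDGES = [(0, 1), (0, 2), (1, 3), (2, 4), (5, 6), (5, 7), (7, 9), (6, 8),
         (8, 10), (5, 11), (6, 12), (11, 12), (11, 13), (13, 15), (12, 14), (14, 16)]

def normalized_adjacency(edges, num_joints=J):
    """Symmetrically normalized adjacency matrix with self-loops."""
    A = torch.eye(num_joints)
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    d_inv_sqrt = torch.diag(A.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ A @ d_inv_sqrt

class STGraphConvBlock(nn.Module):
    """Spatial graph convolution over joints followed by a temporal convolution."""
    def __init__(self, in_ch, out_ch, A):
        super().__init__()
        self.register_buffer("A", A)
        self.spatial = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        self.temporal = nn.Conv2d(out_ch, out_ch, kernel_size=(9, 1), padding=(4, 0))
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):                              # x: (N, C, T, J)
        x = self.spatial(x)                            # mix channels per joint
        x = torch.einsum("nctj,jk->nctk", x, self.A)   # aggregate over the skeleton graph
        x = self.temporal(x)                           # mix information along time
        return F.relu(self.bn(x))

class SkeletonEncoder(nn.Module):
    """Two ST blocks + global pooling -> L2-normalized gait embedding."""
    def __init__(self, A, emb_dim=128):
        super().__init__()
        self.blocks = nn.Sequential(STGraphConvBlock(2, 64, A),
                                    STGraphConvBlock(64, 128, A))
        self.proj = nn.Linear(128, emb_dim)

    def forward(self, x):                              # x: (N, 2, T, J) 2D joint coordinates
        h = self.blocks(x).mean(dim=(2, 3))            # pool over time and joints
        return F.normalize(self.proj(h), dim=1)

def nt_xent(z1, z2, temperature=0.1):
    """SimCLR-style loss: two augmented views of the same tracklet are positives."""
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)                     # (2N, D)
    sim = (z @ z.t()) / temperature
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float("-inf"))
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

def augment(x):
    """Toy augmentation: joint jitter plus a random temporal crop."""
    x = x + 0.01 * torch.randn_like(x)
    t = x.size(2)
    start = torch.randint(0, t // 4 + 1, (1,)).item()
    return x[:, :, start:start + 3 * t // 4, :]

if __name__ == "__main__":
    encoder = SkeletonEncoder(normalized_adjacency(EDGES))
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    tracklets = torch.randn(8, 2, 60, J)               # stand-in for annotated tracklets
    z1, z2 = encoder(augment(tracklets)), encoder(augment(tracklets))
    loss = nt_xent(z1, z2)                             # pull views of one tracklet together
    loss.backward()
    opt.step()
    print(f"pretraining loss: {loss.item():.3f}")
```

After pretraining on unlabeled tracklets, the resulting embeddings would typically be compared with a nearest-neighbour search or fine-tuned on a labeled gait benchmark; those downstream details are beyond this sketch.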

Highlights

  • We propose the Uncooperative Wild Gait (UWG) dataset, which, unlike currently available datasets, contains anonymized skeleton sequences of a large number of people walking in natural environments, with many walking variations and confounding factors; the data are gathered from raw, real-world video streams (Figure 1), and a minimal sketch of one such tracklet follows this list

  • Each identity has three walking variations, corresponding to normal walking (NM), clothing variation (CL) and carrying a bag (BG)
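As referenced above, the following is a minimal, hypothetical sketch of how a single anonymized walking tracklet (per-frame 2D joints from pose tracking plus coarse labels such as NM/CL/BG) could be represented. The field names and hip indices are assumptions for illustration, not the released file format; the point is that only motion information is stored, with no appearance data.

```python
# Hypothetical tracklet record (assumed field names, not the released UWG format):
# only 2D joint coordinates over time and coarse labels, no appearance data.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class SkeletonTracklet:
    track_id: int                    # automatic pose-tracking ID, not a verified person identity
    joints: np.ndarray               # (num_frames, num_joints, 2) 2D keypoint coordinates
    confidence: np.ndarray           # (num_frames, num_joints) pose-detector confidence scores
    variation: Optional[str] = None  # e.g. "NM", "CL", "BG" where such annotations exist

    def normalized(self) -> np.ndarray:
        """Center each frame on the hip midpoint so only motion, not image position, remains."""
        hips = self.joints[:, [11, 12], :].mean(axis=1, keepdims=True)  # assumed hip indices
        return self.joints - hips

# Example: a 60-frame tracklet with 17 COCO-style joints.
tracklet = SkeletonTracklet(track_id=0,
                            joints=np.random.rand(60, 17, 2).astype(np.float32),
                            confidence=np.ones((60, 17), dtype=np.float32),
                            variation="NM")
print(tracklet.normalized().shape)   # (60, 17, 2)
```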


Introduction

Human behaviour is complex; it defines us and is driven, in part, by the environment, social influences, life experiences, and internal emotional factors, attitudes and values. Walking patterns can be used to estimate the age and gender of a person [1], estimate emotions [2], and provide insight into various physiological conditions [3]. Aside from these soft biometrics, gait information is used as a unique fingerprinting method for identifying individuals. The intrinsic dynamic nature of walking makes it susceptible to a multitude of confounding factors such as view angle, shoes and clothing, carrying variations, age, interactions with other people, and the various actions a person performs while walking.
