Abstract

In this paper, we introduce a framework for dynamic gesture recognition with background suppression operating on the output of a moving event-based camera. The system is developed to operate in real time using only the computational capabilities of a mobile phone. It introduces a new development of the time-surface concept. It also presents a novel event-based methodology to dynamically remove backgrounds, exploiting the high temporal resolution of event-based cameras. To our knowledge, this is the first Android event-based framework for vision-based recognition of dynamic gestures running on a smartphone without off-board processing. We assess performance in several indoor and outdoor scenarios, under both static and dynamic conditions and uncontrolled lighting. We also introduce a new event-based dataset for gesture recognition with static and dynamic backgrounds (made publicly available). The set of gestures was selected following a clinical trial to allow human-machine interaction for the visually impaired and older adults. We finally report comparisons with prior work on event-based gesture recognition, achieving comparable results without advanced classification techniques or power-hungry hardware.

Highlights

  • This article focuses on the problem of gesture recognition and dynamic background suppression using the output of a neuromorphic asynchronous event-based camera (Figure 1) connected to a mobile phone (Maro et al., 2019)

  • We address the difficult problem of dynamic background suppression by introducing a novel low power event-based technique operating in the temporal domain

  • The only vision-based dynamic gesture recognition method for smartphones that we found is the one proposed by Rao and Kishore (2016)
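The highlights mention a low-power temporal technique for dynamic background suppression but do not spell out the filter here. As an illustrative sketch only (the grid size, time window, and threshold `k` are hypothetical parameters, not the paper's), one way to exploit the high temporal resolution of an event camera is to keep only events from regions whose recent event rate stands out against the scene-wide average, since a moving hand fires events far more densely than the drifting background:

```python
import numpy as np
from collections import deque

def suppress_background(events, sensor_shape=(180, 240),
                        window=10e3, cell=16, k=2.0):
    """Illustrative temporal filter: keep events from grid cells whose
    recent event rate is well above the average over active cells.
    `events` is an iterable of (x, y, t) tuples sorted by timestamp."""
    gh = (sensor_shape[0] + cell - 1) // cell
    gw = (sensor_shape[1] + cell - 1) // cell
    counts = np.zeros((gh, gw))
    recent = deque()                      # (t, cy, cx) inside the window
    kept = []
    for x, y, t in events:
        cy, cx = y // cell, x // cell
        while recent and t - recent[0][0] > window:  # expire old events
            _, oy, ox = recent.popleft()
            counts[oy, ox] -= 1
        recent.append((t, cy, cx))
        counts[cy, cx] += 1
        mean = counts.sum() / max(1, (counts > 0).sum())
        if counts[cy, cx] >= k * mean:    # locally dense -> foreground
            kept.append((x, y, t))
    return kept
```

This is not the authors' algorithm, only a minimal stand-in for the idea of discriminating foreground from background purely in the temporal domain, with no frame differencing or optical flow.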


Summary

Introduction

This article focuses on the problem of gesture recognition and dynamic background suppression using the output of a neuromorphic asynchronous event-based camera (Figure 1) connected to a mobile phone (Maro et al., 2019). We chose the popular tasks of vision-based gesture recognition and dynamic background suppression because they are good targets for exploiting the dynamic properties of event-based sensors. We introduce a framework for dynamic gesture recognition with background suppression operating on the output of a moving event-based camera. It introduces a new development of the time-surface concept and presents a novel event-based methodology to dynamically remove backgrounds that exploits the high temporal resolution of event-based cameras. To our knowledge, this is the first Android event-based framework for vision-based recognition of dynamic gestures running on a smartphone without off-board processing. The NavGesture dataset is publicly available at https://www.neuromorphic-vision.com/public/downloads/navgesture/
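A time-surface, in the sense introduced by earlier event-based vision work, maps each incoming event to a local patch of exponentially decayed timestamps, giving a compact descriptor of recent activity around that pixel. As a minimal sketch of the concept (the decay constant `tau` and neighborhood `radius` are hypothetical parameters, and this does not reproduce the paper's specific development of the idea):

```python
import numpy as np

def update_time_surface(last_ts, event, tau=50e3, radius=3):
    """Record an event in the per-pixel last-timestamp map `last_ts`
    and return the local time-surface patch around it: each entry is
    exp((t_last - t) / tau), close to 1 for recently active pixels
    and decaying toward 0 for stale ones."""
    x, y, t = event
    last_ts[y, x] = t
    patch = last_ts[max(0, y - radius):y + radius + 1,
                    max(0, x - radius):x + radius + 1]
    return np.exp((patch - t) / tau)
```

Feeding these patches to a classifier, event by event, is what lets such systems recognize motion without ever building frames.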

