Abstract

Unmanned aerial vehicles (UAVs) play an important role in numerous technical and scientific fields, especially in wilderness rescue. This paper presents work on real-time UAV-based human detection and recognition of body and hand rescue gestures. We use body-feature-based methods, such as YOLOv3-tiny for human detection, to establish biometric communication between the person and the drone. When the presence of a person is detected, the system enters the gesture recognition phase, in which the user and the drone can communicate briefly and effectively while avoiding the drawbacks of speech communication. A dataset of ten body rescue gestures (i.e., Kick, Punch, Squat, Stand, Attention, Cancel, Walk, Sit, Direction, and PhoneCall) has been created with a UAV on-board camera. The two most important gestures are the novel dynamic Attention and Cancel, which represent the set and reset functions, respectively. When the body rescue gesture is recognized as Attention, the drone gradually approaches the user so that the hands appear at a larger resolution for hand gesture recognition. Using deep learning methods, the system achieves 99.80% accuracy on the body gesture test set and 94.71% accuracy on the hand gesture test set. Experiments conducted with a real-time UAV camera confirm that our solution achieves the intended UAV rescue purpose.
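To make the detection stage concrete, the following is a minimal sketch of YOLOv3-tiny person detection on a single frame using OpenCV's DNN module. The weight/config file names, the 416x416 input size, and the confidence threshold are illustrative assumptions, not the authors' exact configuration.

    import cv2
    import numpy as np

    # Hypothetical file names; the paper uses YOLOv3-tiny, but these paths are assumptions.
    net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")
    layer_names = net.getUnconnectedOutLayersNames()

    def detect_person(frame, conf_threshold=0.5):
        """Return (found, boxes): whether at least one person is in the frame, and their boxes."""
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
        net.setInput(blob)
        outputs = net.forward(layer_names)

        boxes = []
        h, w = frame.shape[:2]
        for output in outputs:
            for det in output:
                scores = det[5:]
                class_id = int(np.argmax(scores))
                confidence = float(scores[class_id])
                if class_id == 0 and confidence > conf_threshold:  # COCO class 0 is "person"
                    cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                    boxes.append((int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)))
        return len(boxes) > 0, boxes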

Highlights

  • Six people from our lab (four males and two females, aged between 22 and 32 years) participated in unmanned aerial vehicle (UAV) body rescue gesture data collection and real-time prediction

  • The input to the system is the live video captured by the drone's camera. The process is as follows: first, human detection is performed; when a person is detected by the drone, the system proceeds to the rescue gesture recognition step (see the sketch after this list)

  • Feedback from the human is crucial to the UAV rescue
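The workflow in these highlights can be read as a small state machine over the live video stream. The sketch below is an illustration under stated assumptions, not the authors' implementation: detector, body_model, and hand_model stand in for the trained models, the drone methods (approach, execute) are hypothetical flight commands, and the exact wiring of the Cancel reset is our reading of the set/reset description.

    from enum import Enum, auto

    BODY_GESTURES = ["Kick", "Punch", "Squat", "Stand", "Attention",
                     "Cancel", "Walk", "Sit", "Direction", "PhoneCall"]

    class Stage(Enum):
        SEARCH = auto()        # look for a person with the detector
        BODY_GESTURE = auto()  # classify whole-body rescue gestures
        HAND_GESTURE = auto()  # drone has approached; classify hand gestures

    def step(stage, frame, detector, body_model, hand_model, drone):
        """One iteration of the (assumed) control loop over live UAV frames."""
        if stage is Stage.SEARCH:
            found, _ = detector(frame)
            return Stage.BODY_GESTURE if found else Stage.SEARCH

        if stage is Stage.BODY_GESTURE:
            gesture = body_model(frame)      # -> one label from BODY_GESTURES
            if gesture == "Attention":       # "set": switch to close-range hand interaction
                drone.approach()             # hypothetical flight command
                return Stage.HAND_GESTURE
            if gesture == "Cancel":          # "reset": drop the current interaction
                return Stage.SEARCH
            return Stage.BODY_GESTURE

        # Stage.HAND_GESTURE: act on recognized hand commands until a new reset
        command = hand_model(frame)
        drone.execute(command)               # hypothetical command dispatch
        return Stage.HAND_GESTURE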


Summary

Introduction

A dataset of ten basic body rescue gestures (i.e., Kick, Punch, Squat, Stand, Attention, Cancel, Walk, Sit, Direction, and PhoneCall) has been created with a UAV on-board camera; it describes some of the body gestures of humans in a wilderness environment. The two most important gestures are the novel dynamic Attention and Cancel, which represent the set and reset functions, respectively. We use this newly created dataset (detailed in Section 2.2) and the hand gesture dataset (detailed in Section 2.3) for human gesture recognition, combining overall body gestures with local hand gestures for better rescue results.
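For illustration, here is a minimal sketch of how such a ten-class gesture dataset could be organized and split into training and test sets. The folder layout (one folder per gesture class), the image file format, and the 80/20 split ratio are assumptions for this sketch, not the paper's actual protocol.

    import pathlib
    import random

    BODY_CLASSES = ["Kick", "Punch", "Squat", "Stand", "Attention",
                    "Cancel", "Walk", "Sit", "Direction", "PhoneCall"]

    def load_split(root="body_gestures", test_ratio=0.2, seed=0):
        """Collect (path, label) pairs from per-class folders and split them."""
        rng = random.Random(seed)
        train, test = [], []
        for label, name in enumerate(BODY_CLASSES):
            samples = sorted((pathlib.Path(root) / name).glob("*.jpg"))
            rng.shuffle(samples)
            cut = int(len(samples) * (1 - test_ratio))
            train += [(p, label) for p in samples[:cut]]
            test += [(p, label) for p in samples[cut:]]
        return train, test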

Machine Specification and UAV Connection
Body Gesture Data-Set Collection
Hand Gesture Data-Set Collection
Methodology
Human Detection
Workflow
Hand Gesture Recognition
Experiment
Findings
Conclusion
