Abstract
Beyond traditional remote-control systems for unmanned aerial vehicles (UAVs) and drones, vision-based hand gesture recognition systems for drone control are an active area of research. However, in contrast to static and sensor-based hand gesture recognition, recognizing dynamic hand gestures is challenging due to the complex, multi-dimensional nature of hand gesture data captured in 2D images. In real-time application scenarios, performance and safety are crucial. We therefore propose a hybrid, lightweight dynamic hand gesture recognition system together with a 3D simulator-based drone control environment for live simulation. We use transfer-learning-based computer vision techniques to detect dynamic hand gestures in real time. Based on the recognized gestures, predetermined commands are selected and sent via a socket connection to a drone simulation environment running on a separate computer. By eliminating conventional input devices, hand gesture detection integrated with the virtual environment offers a user-friendly and immersive way to control drone motion, improving user interaction. The efficacy of this approach is demonstrated across a variety of test scenarios, highlighting its potential applications in remote-control systems, gaming, and training. The system is tested and evaluated in real time and outperforms state-of-the-art methods. The code used in this study is publicly accessible; further details can be found in the “Data Availability Statement”.
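To illustrate the gesture-to-command pipeline described above, the following minimal Python sketch maps a recognized gesture label to a predetermined drone command and sends it over a TCP socket to a simulator on another machine. The gesture labels, command strings, host, and port are hypothetical placeholders, not the paper's actual protocol.

```python
# Illustrative sketch only: not the authors' implementation.
import json
import socket

# Hypothetical mapping from recognized dynamic gestures to drone commands.
GESTURE_TO_COMMAND = {
    "swipe_up": "ASCEND",
    "swipe_down": "DESCEND",
    "swipe_left": "YAW_LEFT",
    "swipe_right": "YAW_RIGHT",
    "fist": "HOVER",
}

def send_command(gesture: str, host: str = "192.168.1.10", port: int = 5005) -> None:
    """Send the command mapped to `gesture` to the simulator on another computer."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is None:
        return  # Unrecognized gesture: send nothing, for safety.
    payload = json.dumps({"command": command}).encode("utf-8")
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(payload)

if __name__ == "__main__":
    # e.g., invoked after the recognizer outputs the label "swipe_up".
    send_command("swipe_up")
```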