Abstract

In this paper, an accelerometer and gyroscope are used to sense gesture commands, which are then classified using a logistic regression model. Seven gestures were chosen and mapped to specific behaviors that a fixed-wing unmanned air vehicle could accomplish. These behaviors specified various searching, following, and tracking patterns that could be used in a dynamic environment. The system was trained to recognize the seven gestures and then tested in a hardware-in-the-loop simulation. The system identified gestures with an overall accuracy of 90%, and five of the seven gestures were correctly identified at least 94% of the time. Each of the behaviors associated with the gestures was tested in simulation, and the ability to dynamically switch between behaviors was demonstrated. The results show that the system can be used as a natural interface to assist an operator in directing an unmanned air vehicle's behaviors.
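The classification pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature layout (per-axis statistics over a gesture window from a 3-axis accelerometer and 3-axis gyroscope), the synthetic data, and the use of scikit-learn's `LogisticRegression` are all assumptions for demonstration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical setup: 7 gesture classes, each sample summarized as a
# 12-dimensional feature vector (e.g. per-axis mean and variance over a
# gesture window from 3-axis accelerometer + 3-axis gyroscope readings).
n_classes, n_features, n_per_class = 7, 12, 60

# Synthetic stand-in for recorded gesture features: each class is drawn
# from a Gaussian with a class-specific mean so the classes are separable.
X = np.vstack([
    rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
    for c in range(n_classes)
])
y = np.repeat(np.arange(n_classes), n_per_class)

# Hold out a test split, then fit a multinomial logistic regression model.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Classification accuracy on held-out gestures; each prediction would then
# be mapped to the corresponding vehicle behavior.
accuracy = clf.score(X_test, y_test)
```

In a real system, each recognized gesture label would index into a table of vehicle behaviors (search, follow, track), and the active behavior would be switched whenever a new gesture is classified.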
