Abstract

Drones have now found their way into diverse fields of application. Becoming accustomed to their presence requires seamless integration into our lives and depends heavily on Human-Drone Interaction (HDI). This paper presents a novel approach to developing a non-invasive Brain-Computer Interface (BCI) for quadcopter control as an assistive device (AD) for people who suffer from neurodegenerative diseases or impaired mobility and have lost the ability to explore the world around them freely. Electroencephalography (EEG) signals of individuals are captured, and actions corresponding to the wearer's thoughts are classified and used to control a quadcopter. Our main contribution lies in modeling this problem as a Markovian process, which enables us to maximize post-classification accuracy even though the data may contain outliers caused by the user's inability to maintain a constant thought stream, by sensor noise, or by classification errors. We propose an algorithm that achieves robust control by breaking the problem into sub-parts: classification, outlier removal, maximum likelihood estimation, and autonomy, each of which is optimized individually. We also present a shared-control algorithm that incorporates visual feedback through three-dimensional reconstruction of the environment to augment the user's decisions with autonomous obstacle avoidance. The entire BCI is built on the Robot Operating System (ROS) framework. Our results suggest that post-processing the classified data improves accuracy and system reaction time with minimal detriment to computational efficiency.
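The abstract's Markovian post-processing step can be illustrated with a minimal sketch. The paper does not specify its exact model, so the following assumes a simple hidden Markov model over discrete flight commands: the user tends to hold one intent for several time steps (a self-transition probability `p_stay`), and the EEG classifier emits the true command with probability `p_correct`. Viterbi decoding then recovers the maximum-likelihood command sequence, smoothing away isolated outlier labels. All parameter names and values here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def viterbi_smooth(observations, n_states, p_stay=0.9, p_correct=0.8):
    """Most-likely true command sequence given noisy classifier labels.

    Assumed model (illustrative): the user holds one intent with
    probability p_stay per step, and the classifier outputs the true
    command with probability p_correct (errors spread uniformly).
    """
    # Transition matrix: favor staying in the same command state.
    trans = np.full((n_states, n_states), (1 - p_stay) / (n_states - 1))
    np.fill_diagonal(trans, p_stay)
    # Emission matrix: observed label equals the true state with p_correct.
    emit = np.full((n_states, n_states), (1 - p_correct) / (n_states - 1))
    np.fill_diagonal(emit, p_correct)

    log_trans, log_emit = np.log(trans), np.log(emit)
    T = len(observations)
    dp = np.zeros((T, n_states))          # best log-probability so far
    back = np.zeros((T, n_states), int)   # backpointers for the path
    dp[0] = np.log(1.0 / n_states) + log_emit[:, observations[0]]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans   # (prev state, cur state)
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emit[:, observations[t]]

    # Backtrack the maximum-likelihood state path.
    path = [int(dp[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# A lone outlier label (2) inside a run of command 0 is smoothed away:
obs = [0, 0, 2, 0, 0, 1, 1, 1]
print(viterbi_smooth(obs, n_states=3))  # [0, 0, 0, 0, 0, 1, 1, 1]
```

Under these assumptions, a single spurious label costs two unlikely state switches to explain, so the decoder prefers to treat it as a classifier error, which is the outlier-removal effect the abstract describes.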
