Abstract

This study considers a brain-machine interface (BMI) system based on the steady-state visually evoked potential (SSVEP) for controlling quadcopters using electroencephalography (EEG) signals. A single EEG channel with a dry electrode, i.e., without conductive gel or paste, was used to minimize the burden on users. Convolutional neural network (CNN) and long short-term memory (LSTM) models, both of which have received significant research attention, were used to classify the EEG data recorded while subjects viewed a multi-flicker screen with flickers at five different frequencies, each flicker corresponding to a drone command, viz., takeoff, forward and sideways movements, and landing. The subjects of the experiment were seven healthy men. Results indicate a high accuracy of 97% with the LSTM model when a 2 s segment is used as the unit of processing. The LSTM classifier retains a high accuracy of 93% even with a 0.5 s segment, thereby reducing the system delay, which may be required for safety reasons in real-time applications. In a system demonstration, 2 of the 7 subjects controlled the quadcopter through movements such as takeoff, forward motion, and landing, achieving an average success rate of 90%.
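
The abstract does not give implementation details; the following is a minimal sketch, not the authors' code, of an LSTM classifier over single-channel SSVEP EEG segments with five output classes. The sampling rate, hidden size, and network layout are illustrative assumptions.

# Sketch (assumed, not from the paper): LSTM classifier for single-channel
# SSVEP EEG segments mapped to five flicker frequencies / drone commands.
import torch
import torch.nn as nn

SAMPLE_RATE_HZ = 256      # assumed EEG sampling rate (not stated in the abstract)
SEGMENT_SECONDS = 2.0     # 2 s processing unit, as in the abstract
NUM_CLASSES = 5           # five flicker frequencies / drone commands

class SSVEPLSTM(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        # Single-channel EEG -> one feature per time step
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, NUM_CLASSES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, 1) EEG samples of one segment
        _, (h_n, _) = self.lstm(x)
        # Use the final hidden state to predict the attended flicker frequency
        return self.classifier(h_n[-1])

if __name__ == "__main__":
    time_steps = int(SAMPLE_RATE_HZ * SEGMENT_SECONDS)
    model = SSVEPLSTM()
    dummy_segment = torch.randn(8, time_steps, 1)   # batch of 8 EEG segments
    logits = model(dummy_segment)                   # (8, 5) class scores
    print(logits.shape)

For a 0.5 s processing unit, only SEGMENT_SECONDS would change; the shorter segment trades some accuracy (93% vs. 97% reported for the LSTM) for lower system delay.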
