Abstract

Brain–computer interfaces (BCIs) have traditionally focused on improving the quality of life of people with motor or communication disabilities. However, this technology has found new applications, such as augmenting human capacities. Nowadays, several researchers are probing human capabilities to control several robotic devices simultaneously. Designing a BCI is intricate work that takes a long time to implement. For this reason, this article presents an architecture for designing and implementing different types of BCIs. The architecture has a modular design capable of reading various electroencephalography (EEG) sensors and controlling several robotic devices, similar to the plug-and-play paradigm. To test the proposed architecture, a BCI able to manage a hexapod robot and a drone was implemented. First, a mobile robotic platform was designed and built. The BCI is based on eye blinking, where a single blink represents a robot command. For the hexapod robot, the command orders it to start or stop its locomotion; for the drone, a blink represents a takeoff or landing order. The blinking signals are obtained by EEG sensors from the prefrontal and frontal regions of the head. The signals are then filtered using temporal filters whose cutoff frequencies are based on the delta, theta, alpha, and beta bands. The filtered signals were labeled and used to train a classifier based on the multilayer perceptron (MLP) model. To generate a robot command, the proposed BCI uses two MLP models to confirm the prediction: when the two classifiers make the same prediction within a defined time interval, the signal is sent to the robot to start or stop its movement. The obtained results show that the hexapod robot can be controlled with high precision, reaching 91.7% with an average of 81.4%.
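The command-generation rule described above (two MLP classifiers must agree within a defined time interval before a command is sent) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `AgreementGate`, the 0.5 s window, and the string labels are assumptions, and the EEG band ranges are the standard approximate values, not cutoffs taken from the paper.

```python
# Standard approximate EEG frequency bands (Hz); the paper's exact
# cutoff frequencies are not given in the abstract.
EEG_BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 30.0),
}

class AgreementGate:
    """Toy sketch of the dual-classifier voting scheme: a robot command
    is issued only when two independent classifiers produce the same
    prediction within a time window (window length is an assumption)."""

    def __init__(self, window_s=0.5):
        self.window_s = window_s
        self.last = {}  # classifier id -> (prediction, timestamp in seconds)

    def update(self, clf_id, prediction, t):
        """Record one classifier's prediction; return the agreed command
        if both classifiers match within the window, else None."""
        self.last[clf_id] = (prediction, t)
        if len(self.last) < 2:
            return None  # only one classifier has reported so far
        (p1, t1), (p2, t2) = self.last.values()
        if p1 == p2 and abs(t1 - t2) <= self.window_s:
            return p1  # agreed prediction, e.g. a start/stop toggle
        return None
```

For example, if `mlp_a` reports `"blink"` at t = 0.0 s and `mlp_b` reports `"blink"` at t = 0.2 s, the gate emits the command; a disagreeing or late prediction yields nothing, which is what suppresses spurious single-classifier detections.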

Highlights

  • Nowadays, technological development has played a significant role in our everyday lives, where comfort and improving the quality of life are essential

  • We focus on work related to brain–computer interfaces (BCIs) used to control different devices

  • The classifiers have to be trained with previously captured EEG signals

Introduction

Technological development has played a significant role in our everyday lives, where comfort and improving the quality of life are essential. The development of mobile robots has been one answer to this need. Autonomous robots are smart machines capable of performing tasks in their surroundings without explicit human control; such robots must be able to operate in entirely unknown environments.[1] Nonautonomous robots, on the other hand, must be operated by a human through a control interface to perform a task.[2] These interfaces, called man–machine interfaces, allow humans to communicate with different machines or robots. Man–machine interfaces can be of various types, such as graphical interfaces or remote controls, but both usually use a series of commands to define the activities to be carried out.[3] Currently, computer systems require interface designs adapted to the individual and social needs of the user.[4]
