Abstract

The brain-computer interface (BCI), also referred to as the brain-machine interface (BMI), is a prominent topic in neuroscience, translating human thoughts into signals that a chip can process. These devices may be surgically implanted or placed externally. Such components allow the user to control actuators or sense input data through bidirectional communication to accomplish a task. Most current applications focus on neural prosthetics, artificial limbs, cochlear implants, and assistive devices for people affected by neurological disorders such as Alzheimer's and Parkinson's disease. BCI was initially developed to assist people with neurological disorders. With the evolution of noninvasive imaging components, BCI is being extended to broader communication applications such as brain-to-brain interfacing. Implementing BCI on neuromorphic hardware would further reduce computational complexity and improve execution speed, energy efficiency, and robustness against local failures. Machine learning and deep learning algorithms are contributing to computer vision, speech recognition, game control, autonomous vehicle systems, disease classification and prediction, and many other fields. Although BCI has improved the quality of life of end users, the responsiveness of these devices does not yet match that of natural limbs and senses. Hence, many computational intelligence methods have been proposed to create an effective pathway from the brain to the external world through mapping, augmenting, assisting, and troubleshooting. This chapter discusses the translation of brain waves into features and their subsequent classification to control applications in open- or closed-loop environments with a secure mechanism, together with adaptive learning algorithms.
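To make the brain-waves-to-features-to-classifier pipeline mentioned above concrete, the following is a minimal sketch, not the chapter's actual method: it assumes a hypothetical 8-channel EEG recording at 250 Hz, extracts band-power features, and trains a standard linear discriminant classifier on synthetic stand-in data.

```python
# Minimal sketch of a brain-wave -> feature -> classifier pipeline.
# Sampling rate, channel count, frequency bands, and the synthetic trials
# are illustrative assumptions, not values from the chapter.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(eeg_window):
    """Mean spectral power per band for each channel (channels x samples)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

# Synthetic two-class trials standing in for recorded EEG epochs.
rng = np.random.default_rng(0)
X = np.array([band_power_features(rng.standard_normal((8, FS * 2)))
              for _ in range(40)])
y = np.repeat([0, 1], 20)

# Train on half the trials, evaluate on the held-out half.
clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```

In a real BCI, the synthetic windows would be replaced by labeled EEG epochs, and the classifier output would drive an actuator or application command in the open- or closed-loop setting described above.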
