Abstract

Brain–computer interfaces (BCIs) measure brain activity to assist communication and the operation of equipment. However, for application of BCIs in real-life settings, portability and environmental adaptability are crucial. For BCIs using visual stimuli, portability can be achieved by using augmented reality (AR) with head-mounted displays, but prior placement of markers in the physical space is required for displaying appropriate choices. In this study, we demonstrate an environmentally adaptable AR-BCI that uses machine learning and depth sensors. The virtual marker that the user is focusing on through the transmissive AR display is estimated from the user's electroencephalogram (EEG), and preliminary accuracy exceeds chance level. These results suggest that the use of machine learning and AR head-mounted displays can increase the adaptability of BCIs. © 2022 Institute of Electrical Engineers of Japan. Published by Wiley Periodicals LLC.
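The abstract does not specify the classification paradigm, but visual-stimulus BCIs commonly tag each selectable target with a distinct flicker frequency and detect the attended one from the steady-state visual evoked potential (SSVEP) in the EEG. The sketch below illustrates that general idea only; it is not the authors' method. All names (`ssvep_score`, `classify_target`), the sampling rate, and the candidate frequencies are hypothetical, and the sin/cos regression score is a simplified stand-in for the canonical-correlation analysis typically used in practice.

```python
import numpy as np

def ssvep_score(eeg, freq, fs):
    """R^2 of a least-squares fit of sin/cos references at `freq` (Hz)
    to a single-channel EEG trace sampled at `fs` (Hz).
    A simplified stand-in for CCA-based SSVEP scoring."""
    t = np.arange(len(eeg)) / fs
    refs = np.column_stack([np.sin(2 * np.pi * freq * t),
                            np.cos(2 * np.pi * freq * t)])
    coef, *_ = np.linalg.lstsq(refs, eeg, rcond=None)
    fitted = refs @ coef
    ss_res = np.sum((eeg - fitted) ** 2)
    ss_tot = np.sum((eeg - eeg.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def classify_target(eeg, candidate_freqs, fs):
    """Pick the stimulus frequency whose references best explain the EEG,
    i.e. the virtual marker the user is presumably attending to."""
    scores = [ssvep_score(eeg, f, fs) for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

# Demo with synthetic "EEG": a 12 Hz component buried in noise.
fs = 250  # hypothetical sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_target(eeg, [8.0, 10.0, 12.0, 15.0], fs))  # → 12.0
```

In an AR head-mounted display, each candidate frequency would correspond to one virtual marker anchored in the scene (e.g., via the depth sensor), so the classifier's output maps directly to the user's selection.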
