Abstract

Over the past decade, the Human–Computer Interaction (HCI) Lab at Tufts University has been developing real-time, implicit Brain–Computer Interfaces (BCIs) using functional near-infrared spectroscopy (fNIRS). This paper reviews the work of the lab: we explore how we have used fNIRS to develop BCIs based on a variety of human states and applications, including cognitive workload, multitasking, musical learning, and preference detection. Our work indicates that fNIRS is a robust tool for classifying brain states in real time, which can provide programmers with useful information for developing interfaces that are more intuitive and beneficial for the user than is currently possible with today's human input devices (e.g., mouse and keyboard).
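
To make the classification step concrete, the snippet below is a minimal Python sketch of the kind of real-time pipeline this refers to: it trains a linear classifier on windowed fNIRS features and then labels a new window the way a streaming system would. The feature layout (mean and slope of HbO/HbR per channel), the workload labels, and the random placeholder data are illustrative assumptions, not the lab's actual pipeline.

    # Minimal sketch of fNIRS brain-state classification (illustrative
    # placeholder data; not the lab's actual pipeline).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    N_TRIALS, N_CHANNELS = 120, 8
    N_FEATURES = N_CHANNELS * 4       # mean + slope, for HbO and HbR, per channel

    # Placeholder features standing in for preprocessed fNIRS windows.
    X = rng.normal(size=(N_TRIALS, N_FEATURES))
    y = rng.integers(0, 2, size=N_TRIALS)   # 0 = low workload, 1 = high workload

    clf = LinearDiscriminantAnalysis()
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    # Once trained, the same model labels new windows as they stream in,
    # and the predicted state can drive an implicit (passive) interface.
    clf.fit(X, y)
    state = clf.predict(rng.normal(size=(1, N_FEATURES)))[0]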

Highlights

  • Over the past decade, the HCI (Human–Computer Interaction) laboratory at Tufts University, helmed by Dr Robert Jacob, has investigated the application of functional near-infrared spectroscopy (fNIRS) data as input to dynamic brain–computer interfaces (BCIs)

  • Our work has demonstrated that fNIRS can be used as an effective tool to identify a variety of cognitive responses in real time, and that this capability can be used to develop and evaluate user interfaces based on fNIRS signals

  • In one study, participants supervising multiple UAVs neglected fewer UAVs under the adaptive condition than under the control condition (1.37 neglected UAVs under control). We interpret these results to mean that in the adaptive condition, participants were more aware of obstacles and more attentive to the UAVs. This experiment suggests that fNIRS-based brain–computer interfaces can aid programmatic task allocation, in which a user's cognitive workload state is dynamically matched with tasks of appropriate complexity or difficulty (a minimal sketch of such an allocation loop follows this list)
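
The allocation logic above can be pictured as a small control loop. The sketch below is a hypothetical Python rendering, not the study's implementation; the Operator structure, the "low"/"high" workload labels, and every function name are illustrative assumptions.

    # Hypothetical workload-matched task allocation loop (names, labels,
    # and policy are illustrative assumptions, not the study's code).
    from dataclasses import dataclass

    @dataclass
    class Operator:
        assigned_uavs: list

    def classify_workload(fnirs_window) -> str:
        """Stand-in for a trained classifier; returns 'low' or 'high'."""
        return "low"  # a real system would run the fNIRS model here

    def reallocate(operator: Operator, autopilot_queue: list, fnirs_window) -> None:
        """Match the operator's task load to their current cognitive state."""
        state = classify_workload(fnirs_window)
        if state == "high" and operator.assigned_uavs:
            # Shed load: hand the most recently assigned UAV back to automation.
            autopilot_queue.append(operator.assigned_uavs.pop())
        elif state == "low" and autopilot_queue:
            # Spare capacity: give the operator another UAV to supervise.
            operator.assigned_uavs.append(autopilot_queue.pop())

    op, queue = Operator(["uav-1", "uav-2"]), ["uav-3"]
    reallocate(op, queue, fnirs_window=None)   # op now supervises uav-3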

Introduction

The HCI (Human–Computer Interaction) laboratory at Tufts University, helmed by Dr Robert Jacob, has investigated the application of fNIRS (functional near-infrared spectroscopy) data as input to dynamic brain–computer interfaces (BCIs). Improvements in fNIRS measurement technology and in real-time machine-learning analysis have made possible a new generation of passive or implicit BCIs, which have a different goal and target audience from conventional BCIs. Most previous work in BCI has focused on severely paralyzed users. BCIs for such people usually require them to explicitly produce patterns of measurable brain signals, for example by imagining that they are moving their toe or that they are saying a particular word. The resulting user interfaces may be slow and awkward, but they are life-changing for those patients with “locked-in” syndrome [1]. This review focuses on research conducted in the Tufts HCI Lab.
