Abstract

The use of P300-based brain–computer interfaces (BCIs) in daily life should take the user’s emotional state into account, because various emotional conditions are likely to influence event-related potentials (ERPs) and, consequently, the performance of P300-based BCIs. This study investigated whether external emotional stimuli affect the performance of a P300-based BCI built specifically for controlling home appliances. While subjects controlled an electric light device using a P300-based BCI, we presented emotional auditory stimuli that had been selected for each subject based on individual valence scores evaluated a priori. There were four auditory conditions: high valence, low valence, noise, and no sound. Subjects controlled the electric light device with the BCI in real time with a mean accuracy of 88.14%. Neither the overall accuracy nor the P300 features over most EEG channels differed significantly across the four auditory conditions (p > 0.05). When we measured emotional states using frontal alpha asymmetry (FAA) and compared FAA across the auditory conditions, we again found no significant difference (p > 0.05). Our results suggest that there is no clear evidence that external emotional stimuli influence P300-based BCI performance or P300 features while people control devices with the BCI in real time. This study may provide useful information for those concerned with implementing P300-based BCIs in practice.
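The abstract refers to frontal alpha asymmetry (FAA) as the measure of emotional state. As a rough, non-authoritative illustration, the sketch below computes the conventional FAA index, the difference in log-transformed alpha power between a right and a left frontal electrode (F4 and F3 here); the channel pair, sampling rate, and alpha band are assumptions for illustration, not details taken from the paper's methods.

```python
# Hedged sketch of the conventional frontal alpha asymmetry (FAA) index:
# FAA = ln(alpha power at F4) - ln(alpha power at F3).
# Channel pair, sampling rate, and alpha band (8-13 Hz) are assumptions.
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs, band=(8.0, 13.0)):
    """Mean power spectral density within the alpha band (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_alpha_asymmetry(f3, f4, fs):
    """FAA = ln(right alpha power) - ln(left alpha power); positive values are
    conventionally read as relatively greater left-frontal activity."""
    return np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))

# Toy usage with synthetic single-channel signals (256 Hz, 10 s each).
fs = 256
rng = np.random.default_rng(0)
f3 = rng.standard_normal(fs * 10)
f4 = rng.standard_normal(fs * 10)
print(frontal_alpha_asymmetry(f3, f4, fs))
```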

Highlights

  • A brain–computer interface (BCI) provides a direct communication channel between people and external environments, without any involvement of muscles, by translating brain signals directly into commands (Wolpaw et al., 2000, 2002)

  • To examine the peak amplitude at F3, the peak amplitude was compared between the target and non-target stimuli, and a paired t-test showed no significant difference for any condition (high valence (HV): p = 0.35, low valence (LV): p = 0.27, noise sound presentation (Noise): p = 0.21, None: p = 0.26); a generic sketch of this kind of paired comparison is given after this list

  • The large difference (LD) group showed a significant difference in the peak amplitude only at channel O1 between the high valence (HV) and None conditions (HV < None, p = 0.02), while it showed no difference in the peak latency
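As a generic illustration of the paired comparison described in the second highlight, the sketch below runs a paired t-test on per-subject peak amplitudes for target versus non-target stimuli. The sample size and the values are synthetic placeholders, not data from the study.

```python
# Generic paired t-test sketch: per-subject peak amplitudes for target vs.
# non-target stimuli. All numbers below are synthetic, for illustration only.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
n_subjects = 15                                      # assumed sample size
target_peaks = rng.normal(5.0, 2.0, n_subjects)      # peak amplitude (uV), target
nontarget_peaks = rng.normal(4.0, 2.0, n_subjects)   # peak amplitude (uV), non-target

t_stat, p_value = ttest_rel(target_peaks, nontarget_peaks)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```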



Introduction

A brain–computer interface (BCI) provides a direct communication channel between people and external environments, without any involvement of muscles, by translating brain signals directly into commands (Wolpaw et al., 2000, 2002). A P300-based BCI implements an oddball task with a visual arrangement of letters in a matrix and enables one to select and type a letter using brain activity alone (Farwell and Donchin, 1988). It has been further extended to device control by selecting a target function among available control functions using brain activity (Aloise et al., 2010; Carabalona et al., 2010; Corralejo et al., 2014; Halder et al., 2015; Miralles et al., 2015; Schettini et al., 2015; Pinegger et al., 2016; Zhang et al., 2017). This type of BCI, potentially combined with the Internet of Things (IoT), is especially useful for people with severe neurological disorders to operate household devices such as home appliances (Aydin et al., 2016; Zhong et al., 2019).
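To make the selection principle behind a P300-based BCI concrete, the sketch below shows a minimal version of the usual pipeline: epochs time-locked to each stimulus are averaged per stimulus, and the stimulus whose averaged response is largest in an assumed P300 window is taken as the target. This is a generic illustration of the oddball selection idea, not the signal processing used in the studies cited above; the window, sampling rate, and single-channel simplification are assumptions.

```python
# Minimal sketch of P300-based target selection: average epochs per stimulus,
# then pick the stimulus with the largest mean amplitude in an assumed
# P300 window (~250-500 ms). Generic illustration only.
import numpy as np

def select_target(epochs, labels, fs, window=(0.25, 0.50)):
    """
    epochs: array (n_trials, n_samples), single-channel EEG epochs
            time-locked to stimulus onsets.
    labels: array (n_trials,), which stimulus (e.g. menu item) was flashed.
    Returns the label whose averaged epoch has the largest mean amplitude
    in the assumed P300 window.
    """
    start, stop = int(window[0] * fs), int(window[1] * fs)
    scores = {}
    for lab in np.unique(labels):
        avg = epochs[labels == lab].mean(axis=0)   # average over repetitions
        scores[lab] = avg[start:stop].mean()       # crude P300 score
    return max(scores, key=scores.get)

# Toy usage: 6 menu items, 10 repetitions each, 0.8 s epochs at 256 Hz.
fs, n_items, n_reps = 256, 6, 10
rng = np.random.default_rng(1)
labels = np.repeat(np.arange(n_items), n_reps)
epochs = rng.standard_normal((n_items * n_reps, int(0.8 * fs)))
epochs[labels == 3, int(0.3 * fs):int(0.45 * fs)] += 2.0  # injected "P300" on item 3
print(select_target(epochs, labels, fs))  # expected: 3
```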

