Abstract

The critical need for rapid, objective, physiological evaluation of brain function at point-of-care has led to the emergence of brain vital signs: a framework encompassing portable electroencephalography (EEG) and an automated, rapid test protocol. This framework enables access to well-established event-related potential (ERP) markers, which are specific to sensory, attention, and cognitive functions in both healthy and patient populations. However, all our applications to date have used auditory stimulation, which has highlighted application challenges in persons with hearing impairments (e.g., in aging, seniors, and dementia). Consequently, it has become important to translate brain vital signs into a visual sensory modality. Therefore, the objectives of this study were to: 1) demonstrate the feasibility of visual brain vital signs; and 2) compare and normalize results from visual and auditory brain vital signs. Data were collected from 34 healthy adults (33 ± 13 years) using a 64-channel EEG system. Visual and auditory sequences were kept as comparable as possible to elicit the N100, P300, and N400 responses. Visual brain vital signs were elicited successfully for all three responses across the group (N100: F = 29.8380, p < 0.001; P300: F = 138.8442, p < 0.0001; N400: F = 6.8476, p = 0.01). Initial auditory-visual comparisons across the three components showed that attention processing (P300) was the most transferable across modalities, with no group-level differences and correlated peak amplitudes (rho = 0.7, p = 0.0001) across individuals. Auditory P300 latencies were shorter than visual (p < 0.0001), but normalization and correlation (r = 0.5, p = 0.0033) implied a potential systematic difference across modalities.
Reduced auditory N400 amplitudes compared to visual (p = 0.0061), paired with normalization and correlation across individuals (r = 0.6, p = 0.0012), also revealed potential systematic modality differences between reading and listening language comprehension. This study provides an initial understanding of the relationship between the visual and auditory sequences, while importantly establishing a visual sequence within the brain vital signs framework. With both auditory and visual stimulation capabilities available, it is possible to broaden applications across the lifespan.

Highlights

  • There is an increasing need for objective, neurophysiological measures, such as EEG, to provide unbiased measures of brain function across a range of different points-of-care

  • As an initial validity check, the results demonstrated that the targeted event-related potentials (ERPs) (N100, P300, and N400) were evoked and detectable by comparing mean amplitudes for each stimulus condition within each modality at a group level

  • The current study reinforced the viability of the brain vital sign framework through successful expansion from the auditory to the visual modality

Introduction

There is an increasing need for objective, neurophysiological measures, such as EEG, to provide unbiased measures of brain function across a range of different points-of-care. The translation of EEG/ERP research into neurophysiological assessment applications compatible with the clinical environment has been demonstrated with rapid, non-invasive implementations, such as the Halifax Consciousness Scanner (HCS; D’Arcy et al., 2011) and, more recently, the brain vital signs framework (Ghosh-Hajra et al., 2016a). A results report is generated based on normalized ERP characteristics. This approach has been validated in large samples of healthy individuals by reliably eliciting the targeted ERPs across individuals (Ghosh-Hajra et al., 2016a). Changes in these targeted ERPs have been observed in patients with acquired brain injuries (Fleck-Prediger et al., 2014) and in athletes with concussions (Fickling et al., 2018).
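To make the reporting step above concrete, the sketch below shows one plausible way a raw ERP peak measure could be mapped to a normalized 0-100 score against a normative distribution. This is an illustrative assumption only: the function name `normalize_erp`, the z-score-and-clip scheme, and the normative mean/SD values are all hypothetical and are not taken from the brain vital signs publications.

```python
import numpy as np

def normalize_erp(value, norm_mean, norm_sd):
    """Map a raw ERP measure (e.g., a peak amplitude) to a 0-100 score.

    The measure is converted to a z-score against an assumed normative
    distribution, clipped to +/-3 SD, and linearly rescaled so that the
    normative mean lands at 50. All details here are illustrative.
    """
    z = (value - norm_mean) / norm_sd
    z = np.clip(z, -3.0, 3.0)
    return (z + 3.0) / 6.0 * 100.0

# Example: a hypothetical P300 peak amplitude of 6.2 uV scored against an
# assumed (placeholder) normative mean of 5.0 uV with SD 2.0 uV.
score = normalize_erp(6.2, norm_mean=5.0, norm_sd=2.0)
print(round(score, 1))  # 60.0
```

A scheme like this keeps scores on a common scale across different ERP components (N100, P300, N400) and across amplitude and latency measures, which is what makes a single summary report across components possible.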
